• DOMAIN: Electronics and Telecommunication
• CONTEXT: A communications equipment manufacturing company has a product responsible for emitting informative signals. The company wants to build a machine learning model to predict the equipment's signal quality from various parameters.
• DATA DESCRIPTION: The data set contains information on various signal tests performed:
# Importing required libraries
import numpy as np
import pandas as pd
import seaborn as sns
import scipy.stats as stats
import matplotlib.pyplot as plt
from tensorflow import keras
#from keras.models import Sequential
#from keras.layers import Dense
#from sklearn.model_selection import StratifiedKFold
%matplotlib inline
#Test Train Split
from sklearn.model_selection import train_test_split
#Feature Scaling library
from sklearn.preprocessing import StandardScaler
#import pickle
from tensorflow.keras.utils import to_categorical
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Flatten, Dense
from tensorflow.keras import regularizers, optimizers
from sklearn.metrics import r2_score
from tensorflow.keras.models import load_model
# Initialize the random number generator
import random
seed = 7
np.random.seed(seed)
# Ignore the warnings
import warnings
warnings.filterwarnings("ignore")
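The seeding above fixes NumPy's generator so the run is repeatable; a minimal sketch verifying that re-seeding reproduces the same draws (note that TensorFlow keeps its own generator, so for fully reproducible weight initialization one would also call `tf.random.set_seed(seed)`, assuming that matters for the experiment):

```python
import random

import numpy as np

seed = 7
random.seed(seed)
np.random.seed(seed)

# Drawing after re-seeding reproduces the exact same sequence.
np.random.seed(seed)
first_draw = np.random.rand(3)
np.random.seed(seed)
second_draw = np.random.rand(3)
print(np.allclose(first_draw, second_draw))  # True
```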
# read csv file
mydata=pd.read_csv("PartSignal.csv")
# Display top 5 rows of the dataset
mydata.head()
| | Parameter 1 | Parameter 2 | Parameter 3 | Parameter 4 | Parameter 5 | Parameter 6 | Parameter 7 | Parameter 8 | Parameter 9 | Parameter 10 | Parameter 11 | Signal_Strength |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 7.4 | 0.70 | 0.00 | 1.9 | 0.076 | 11.0 | 34.0 | 0.9978 | 3.51 | 0.56 | 9.4 | 5 |
| 1 | 7.8 | 0.88 | 0.00 | 2.6 | 0.098 | 25.0 | 67.0 | 0.9968 | 3.20 | 0.68 | 9.8 | 5 |
| 2 | 7.8 | 0.76 | 0.04 | 2.3 | 0.092 | 15.0 | 54.0 | 0.9970 | 3.26 | 0.65 | 9.8 | 5 |
| 3 | 11.2 | 0.28 | 0.56 | 1.9 | 0.075 | 17.0 | 60.0 | 0.9980 | 3.16 | 0.58 | 9.8 | 6 |
| 4 | 7.4 | 0.70 | 0.00 | 1.9 | 0.076 | 11.0 | 34.0 | 0.9978 | 3.51 | 0.56 | 9.4 | 5 |
Data analysis & visualisation
# Shape of the data
mydata.shape
(1599, 12)
# Data type of each attribute
mydata.info() # it gives information about the data and data types of each attribute
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 1599 entries, 0 to 1598
Data columns (total 12 columns):
 #   Column           Non-Null Count  Dtype
---  ------           --------------  -----
 0   Parameter 1      1599 non-null   float64
 1   Parameter 2      1599 non-null   float64
 2   Parameter 3      1599 non-null   float64
 3   Parameter 4      1599 non-null   float64
 4   Parameter 5      1599 non-null   float64
 5   Parameter 6      1599 non-null   float64
 6   Parameter 7      1599 non-null   float64
 7   Parameter 8      1599 non-null   float64
 8   Parameter 9      1599 non-null   float64
 9   Parameter 10     1599 non-null   float64
 10  Parameter 11     1599 non-null   float64
 11  Signal_Strength  1599 non-null   int64
dtypes: float64(11), int64(1)
memory usage: 150.0 KB
All 11 parameters are floating point; only Signal_Strength is an integer.
# Checking the presence of missing values
null_counts = mydata.isnull().sum() # This prints the columns with the number of null values they have
print (null_counts)
Parameter 1        0
Parameter 2        0
Parameter 3        0
Parameter 4        0
Parameter 5        0
Parameter 6        0
Parameter 7        0
Parameter 8        0
Parameter 9        0
Parameter 10       0
Parameter 11       0
Signal_Strength    0
dtype: int64
There are no null values in the data
# 5 point summary of numerical attributes
mydata.describe()
| | Parameter 1 | Parameter 2 | Parameter 3 | Parameter 4 | Parameter 5 | Parameter 6 | Parameter 7 | Parameter 8 | Parameter 9 | Parameter 10 | Parameter 11 | Signal_Strength |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| count | 1599.000000 | 1599.000000 | 1599.000000 | 1599.000000 | 1599.000000 | 1599.000000 | 1599.000000 | 1599.000000 | 1599.000000 | 1599.000000 | 1599.000000 | 1599.000000 |
| mean | 8.319637 | 0.527821 | 0.270976 | 2.538806 | 0.087467 | 15.874922 | 46.467792 | 0.996747 | 3.311113 | 0.658149 | 10.422983 | 5.636023 |
| std | 1.741096 | 0.179060 | 0.194801 | 1.409928 | 0.047065 | 10.460157 | 32.895324 | 0.001887 | 0.154386 | 0.169507 | 1.065668 | 0.807569 |
| min | 4.600000 | 0.120000 | 0.000000 | 0.900000 | 0.012000 | 1.000000 | 6.000000 | 0.990070 | 2.740000 | 0.330000 | 8.400000 | 3.000000 |
| 25% | 7.100000 | 0.390000 | 0.090000 | 1.900000 | 0.070000 | 7.000000 | 22.000000 | 0.995600 | 3.210000 | 0.550000 | 9.500000 | 5.000000 |
| 50% | 7.900000 | 0.520000 | 0.260000 | 2.200000 | 0.079000 | 14.000000 | 38.000000 | 0.996750 | 3.310000 | 0.620000 | 10.200000 | 6.000000 |
| 75% | 9.200000 | 0.640000 | 0.420000 | 2.600000 | 0.090000 | 21.000000 | 62.000000 | 0.997835 | 3.400000 | 0.730000 | 11.100000 | 6.000000 |
| max | 15.900000 | 1.580000 | 1.000000 | 15.500000 | 0.611000 | 72.000000 | 289.000000 | 1.003690 | 4.010000 | 2.000000 | 14.900000 | 8.000000 |
Looking at the 11 parameters: Parameter 3 ranges between 0 and 1. The maximum value of Parameter 5 is 0.611. Parameter 8 has a very narrow range, between 0.990 and 1.004, and the lowest standard deviation (0.001887). 'Signal_Strength' takes integer classes from 3 to 8.
# studying the distribution of continuous attributes
cols = list(mydata)
for i in np.arange(len(cols)):
    # distplot is deprecated in recent seaborn; histplot(..., kde=True) is the replacement
    sns.histplot(mydata[cols[i]], kde=True, color='blue')
    plt.show()
    print('Distribution of ', cols[i])
    print('Mean is:', mydata[cols[i]].mean())
    print('Median is:', mydata[cols[i]].median())
    print('Mode is:', mydata[cols[i]].mode())
    print('Standard deviation is:', mydata[cols[i]].std())
    print('Skewness is:', mydata[cols[i]].skew())
    print('Maximum is:', mydata[cols[i]].max())
    print('Minimum is:', mydata[cols[i]].min())
Distribution of Parameter 1 Mean is: 8.319637273295838 Median is: 7.9 Mode is: 0 7.2 Name: Parameter 1, dtype: float64 Standard deviation is: 1.7410963181277006 Skewness is: 0.9827514413284587 Maximum is: 15.9 Minimum is: 4.6
Distribution of Parameter 2 Mean is: 0.5278205128205131 Median is: 0.52 Mode is: 0 0.6 Name: Parameter 2, dtype: float64 Standard deviation is: 0.17905970415353498 Skewness is: 0.6715925723840199 Maximum is: 1.58 Minimum is: 0.12
Distribution of Parameter 3 Mean is: 0.2709756097560964 Median is: 0.26 Mode is: 0 0.0 Name: Parameter 3, dtype: float64 Standard deviation is: 0.19480113740531785 Skewness is: 0.3183372952546368 Maximum is: 1.0 Minimum is: 0.0
Distribution of Parameter 4 Mean is: 2.5388055034396517 Median is: 2.2 Mode is: 0 2.0 Name: Parameter 4, dtype: float64 Standard deviation is: 1.4099280595072805 Skewness is: 4.54065542590319 Maximum is: 15.5 Minimum is: 0.9
Distribution of Parameter 5 Mean is: 0.08746654158849257 Median is: 0.079 Mode is: 0 0.08 Name: Parameter 5, dtype: float64 Standard deviation is: 0.047065302010090154 Skewness is: 5.680346571971724 Maximum is: 0.611 Minimum is: 0.012
Distribution of Parameter 6 Mean is: 15.874921826141339 Median is: 14.0 Mode is: 0 6.0 Name: Parameter 6, dtype: float64 Standard deviation is: 10.46015696980973 Skewness is: 1.250567293314441 Maximum is: 72.0 Minimum is: 1.0
Distribution of Parameter 7 Mean is: 46.46779237023139 Median is: 38.0 Mode is: 0 28.0 Name: Parameter 7, dtype: float64 Standard deviation is: 32.89532447829901 Skewness is: 1.515531257594554 Maximum is: 289.0 Minimum is: 6.0
Distribution of Parameter 8 Mean is: 0.9967466791744831 Median is: 0.99675 Mode is: 0 0.9972 Name: Parameter 8, dtype: float64 Standard deviation is: 0.0018873339538425559 Skewness is: 0.07128766294927483 Maximum is: 1.00369 Minimum is: 0.99007
Distribution of Parameter 9 Mean is: 3.311113195747343 Median is: 3.31 Mode is: 0 3.3 Name: Parameter 9, dtype: float64 Standard deviation is: 0.15438646490354266 Skewness is: 0.19368349811284427 Maximum is: 4.01 Minimum is: 2.74
Distribution of Parameter 10 Mean is: 0.6581488430268921 Median is: 0.62 Mode is: 0 0.6 Name: Parameter 10, dtype: float64 Standard deviation is: 0.16950697959010977 Skewness is: 2.4286723536602945 Maximum is: 2.0 Minimum is: 0.33
Distribution of Parameter 11 Mean is: 10.422983114446502 Median is: 10.2 Mode is: 0 9.5 Name: Parameter 11, dtype: float64 Standard deviation is: 1.0656675818563965 Skewness is: 0.8608288069184189 Maximum is: 14.9 Minimum is: 8.4
Distribution of Signal_Strength Mean is: 5.6360225140712945 Median is: 6.0 Mode is: 0 5 Name: Signal_Strength, dtype: int64 Standard deviation is: 0.8075694397347023 Skewness is: 0.21780157547366327 Maximum is: 8 Minimum is: 3
Mean, median and mode are almost overlapping or very close to each other, except in Parameter 7. Parameter 3 is trimodal, and Signal_Strength is a classification variable. All attributes are positively skewed. Standard deviation is highest for Parameter 7, at 32.8953.
sns.countplot(x='Signal_Strength', data=mydata)  # Distribution of the column 'Signal_Strength'
plt.show()
Class 5 has the highest count in 'Signal_Strength'.
#plt.figure(figsize = (50,50))
sns.pairplot(mydata,diag_kind='kde')
plt.show()
1. Parameter 6 and Parameter 7 are highly correlated with each other, and both have almost zero correlation with the other parameters.
2. Parameter 1 is positively correlated with Parameter 3 and Parameter 8, and negatively correlated with Parameter 2 and Parameter 9.
3. Parameter 4 has very low correlation with the other parameters.
# Checking the presence of outliers
l = len(mydata)
col = list(mydata.columns)
for i in np.arange(len(col)):
    sns.boxplot(x=mydata[col[i]], color='cyan')
    plt.show()
    print('Boxplot of ', col[i])
    # calculating the outliers in the attribute
    Q1 = mydata[col[i]].quantile(0.25)
    Q2 = mydata[col[i]].quantile(0.50)
    Q3 = mydata[col[i]].quantile(0.75)
    IQR = Q3 - Q1
    L_W = Q1 - 1.5 * IQR
    U_W = Q3 + 1.5 * IQR
    print('Q1 is : ', Q1)
    print('Q2 is : ', Q2)
    print('Q3 is : ', Q3)
    print('IQR is:', IQR)
    print('Lower Whisker, Upper Whisker : ', L_W, ',', U_W)
    bools = (mydata[col[i]] < L_W) | (mydata[col[i]] > U_W)
    print('Out of ', l, ' rows in data, number of outliers are:', bools.sum())  # counting the outliers
Boxplot of Parameter 1 Q1 is : 7.1 Q2 is : 7.9 Q3 is : 9.2 IQR is: 2.0999999999999996 Lower Whisker, Upper Whisker : 3.95 , 12.349999999999998 Out of 1599 rows in data, number of outliers are: 49
Boxplot of Parameter 2 Q1 is : 0.39 Q2 is : 0.52 Q3 is : 0.64 IQR is: 0.25 Lower Whisker, Upper Whisker : 0.015000000000000013 , 1.0150000000000001 Out of 1599 rows in data, number of outliers are: 19
Boxplot of Parameter 3 Q1 is : 0.09 Q2 is : 0.26 Q3 is : 0.42 IQR is: 0.32999999999999996 Lower Whisker, Upper Whisker : -0.4049999999999999 , 0.9149999999999999 Out of 1599 rows in data, number of outliers are: 1
Boxplot of Parameter 4 Q1 is : 1.9 Q2 is : 2.2 Q3 is : 2.6 IQR is: 0.7000000000000002 Lower Whisker, Upper Whisker : 0.8499999999999996 , 3.6500000000000004 Out of 1599 rows in data, number of outliers are: 155
Boxplot of Parameter 5 Q1 is : 0.07 Q2 is : 0.079 Q3 is : 0.09 IQR is: 0.01999999999999999 Lower Whisker, Upper Whisker : 0.04000000000000002 , 0.11999999999999998 Out of 1599 rows in data, number of outliers are: 112
Boxplot of Parameter 6 Q1 is : 7.0 Q2 is : 14.0 Q3 is : 21.0 IQR is: 14.0 Lower Whisker, Upper Whisker : -14.0 , 42.0 Out of 1599 rows in data, number of outliers are: 30
Boxplot of Parameter 7 Q1 is : 22.0 Q2 is : 38.0 Q3 is : 62.0 IQR is: 40.0 Lower Whisker, Upper Whisker : -38.0 , 122.0 Out of 1599 rows in data, number of outliers are: 55
Boxplot of Parameter 8 Q1 is : 0.9956 Q2 is : 0.99675 Q3 is : 0.997835 IQR is: 0.002234999999999987 Lower Whisker, Upper Whisker : 0.9922475000000001 , 1.0011875 Out of 1599 rows in data, number of outliers are: 45
Boxplot of Parameter 9 Q1 is : 3.21 Q2 is : 3.31 Q3 is : 3.4 IQR is: 0.18999999999999995 Lower Whisker, Upper Whisker : 2.925 , 3.6849999999999996 Out of 1599 rows in data, number of outliers are: 35
Boxplot of Parameter 10 Q1 is : 0.55 Q2 is : 0.62 Q3 is : 0.73 IQR is: 0.17999999999999994 Lower Whisker, Upper Whisker : 0.28000000000000014 , 0.9999999999999999 Out of 1599 rows in data, number of outliers are: 59
Boxplot of Parameter 11 Q1 is : 9.5 Q2 is : 10.2 Q3 is : 11.1 IQR is: 1.5999999999999996 Lower Whisker, Upper Whisker : 7.1000000000000005 , 13.5 Out of 1599 rows in data, number of outliers are: 13
Boxplot of Signal_Strength Q1 is : 5.0 Q2 is : 6.0 Q3 is : 6.0 IQR is: 1.0 Lower Whisker, Upper Whisker : 3.5 , 7.5 Out of 1599 rows in data, number of outliers are: 28
Parameter 4 has the highest number of outliers which is 155.
# function to detect and optionally treat outliers
def detect_treate_outliers(df, operation):
    cols = []
    IQR_list = []
    lower_boundary_list = []
    upper_boundary_list = []
    outliers_count = []
    for col in df.columns:
        if df[col].dtype == 'int64' or df[col].dtype == 'float64':
            IQR = df[col].quantile(0.75) - df[col].quantile(0.25)
            lower_boundary = df[col].quantile(0.25) - (1.5 * IQR)
            upper_boundary = df[col].quantile(0.75) + (1.5 * IQR)
            up_cnt = df[df[col] > upper_boundary][col].shape[0]
            lw_cnt = df[df[col] < lower_boundary][col].shape[0]
            if (up_cnt + lw_cnt) > 0:
                cols.append(col)
                IQR_list.append(IQR)
                lower_boundary_list.append(lower_boundary)
                upper_boundary_list.append(upper_boundary)
                outliers_count.append(up_cnt + lw_cnt)
                if operation == 'update':
                    # clip values beyond the whiskers to the whiskers
                    df.loc[df[col] > upper_boundary, col] = upper_boundary
                    df.loc[df[col] < lower_boundary, col] = lower_boundary
    ndf = pd.DataFrame(
        list(zip(cols, IQR_list, lower_boundary_list, upper_boundary_list, outliers_count)),
        columns=['Features', 'IQR', 'Lower Boundary', 'Upper Boundary', 'Outlier Count'])
    if operation == 'update':
        return (len(cols), df)
    else:
        return (len(cols), ndf)
# Treating outliers by clipping values below the lower whisker and above the upper whisker to the respective whisker.
count,df=detect_treate_outliers(mydata,'update')
if count>0:
print('Updating dataset')
mydata=df
Updating dataset
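The clipping step above can be sketched on a small synthetic Series: after values beyond the 1.5×IQR whiskers are replaced by the whiskers, a second detection pass finds no outliers. The names `clip_iqr` and `s` are illustrative, not from the notebook:

```python
import pandas as pd

def clip_iqr(s: pd.Series) -> pd.Series:
    """Clip values outside the 1.5*IQR whiskers to the whiskers."""
    q1, q3 = s.quantile(0.25), s.quantile(0.75)
    iqr = q3 - q1
    return s.clip(lower=q1 - 1.5 * iqr, upper=q3 + 1.5 * iqr)

s = pd.Series([1.9, 2.0, 2.1, 2.2, 2.6, 15.5])  # 15.5 is an outlier
clipped = clip_iqr(s)

# Re-detect against the original whiskers: nothing is flagged any more.
q1, q3 = s.quantile(0.25), s.quantile(0.75)
iqr = q3 - q1
remaining = ((clipped < q1 - 1.5 * iqr) | (clipped > q3 + 1.5 * iqr)).sum()
print(remaining)  # 0
```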
# studying correlation between the attributes
b_corr=mydata.corr()
plt.subplots(figsize =(12, 7))
sns.heatmap(b_corr,annot=True)
<AxesSubplot:>
A high correlation coefficient lies between ±0.50 and ±1. Parameter 1 is highly correlated with Parameters 3, 8 and 9, and Parameters 6 and 7 are highly correlated with each other. Since no correlation reaches about 0.8 or above, no features are dropped.
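Had any pair reached a correlation near 0.8, a common follow-up is to drop one member of each such pair. A minimal sketch on synthetic columns, with an illustrative 0.8 threshold (the column names `a`, `b`, `c` are hypothetical):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
a = rng.normal(size=200)
df = pd.DataFrame({
    "a": a,
    "b": a * 2 + rng.normal(scale=0.01, size=200),  # near-duplicate of a
    "c": rng.normal(size=200),                      # independent
})

corr = df.corr().abs()
# Keep only the upper triangle so each pair is inspected once,
# then flag the second member of any pair with |r| > 0.8.
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
to_drop = [col for col in upper.columns if (upper[col] > 0.8).any()]
print(to_drop)  # ['b']
```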
Design, train, tune and test a neural network regressor.
X = mydata.drop("Signal_Strength", axis=1)
y = mydata['Signal_Strength']
# splitting to create test data
X_vtrain, X_test, y_vtrain, y_test = train_test_split(X, y, test_size=.30, random_state=seed)
X_vtrain.shape
(1119, 11)
# splitting to create training and validation data
X_train, X_val, y_train, y_val = train_test_split(X_vtrain, y_vtrain, test_size=.20, random_state=seed)
X_train.shape
(895, 11)
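The two-stage split does not yield a 70/20 train/validation share of the whole dataset: 30% goes to test, and 20% of the remaining 70% (14% overall) goes to validation, leaving 56% for training. A quick check of the resulting sizes on dummy data with the same row count:

```python
import numpy as np
from sklearn.model_selection import train_test_split

X = np.arange(1599 * 2).reshape(1599, 2)  # dummy features, same 1599 rows
y = np.zeros(1599)

X_vtrain, X_test, y_vtrain, y_test = train_test_split(
    X, y, test_size=0.30, random_state=7)
X_train, X_val, y_train, y_val = train_test_split(
    X_vtrain, y_vtrain, test_size=0.20, random_state=7)

print(len(X_train), len(X_val), len(X_test))  # 895 224 480
```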
# Initialize Sequential model (tf itself was never imported above, only keras,
# so use the keras namespace rather than tf.keras)
model_reg = keras.models.Sequential()
# Normalize input data
model_reg.add(keras.layers.BatchNormalization(input_shape=(11,)))
# Add final Dense layer for prediction - keras declares weights and bias automatically
model_reg.add(keras.layers.Dense(1))
# Compile the model - mean squared error as loss, stochastic gradient descent as optimizer
model_reg.compile(optimizer='sgd', loss='mse')
model_reg.fit(X_train, y_train, validation_data=(X_val,y_val),epochs=100, batch_size=10)
Epoch 1/100 90/90 [==============================] - 0s 2ms/step - loss: 4.2097 - val_loss: 3.3021 Epoch 2/100 90/90 [==============================] - 0s 786us/step - loss: 0.9264 - val_loss: 1.2704 Epoch 3/100 90/90 [==============================] - 0s 789us/step - loss: 0.7334 - val_loss: 0.7663 Epoch 4/100 90/90 [==============================] - 0s 792us/step - loss: 0.5899 - val_loss: 0.4451 Epoch 5/100 90/90 [==============================] - 0s 792us/step - loss: 0.5296 - val_loss: 0.3390 Epoch 6/100 90/90 [==============================] - 0s 804us/step - loss: 0.4793 - val_loss: 0.3589 Epoch 7/100 90/90 [==============================] - 0s 785us/step - loss: 0.4926 - val_loss: 0.3299 Epoch 8/100 90/90 [==============================] - 0s 763us/step - loss: 0.4911 - val_loss: 0.3305 Epoch 9/100 90/90 [==============================] - 0s 847us/step - loss: 0.4670 - val_loss: 0.3315 Epoch 10/100 90/90 [==============================] - 0s 844us/step - loss: 0.4664 - val_loss: 0.3507 Epoch 11/100 90/90 [==============================] - 0s 811us/step - loss: 0.4492 - val_loss: 0.3349 Epoch 12/100 90/90 [==============================] - 0s 813us/step - loss: 0.4375 - val_loss: 0.3339 Epoch 13/100 90/90 [==============================] - 0s 827us/step - loss: 0.4409 - val_loss: 0.3379 Epoch 14/100 90/90 [==============================] - 0s 802us/step - loss: 0.4416 - val_loss: 0.3722 Epoch 15/100 90/90 [==============================] - 0s 762us/step - loss: 0.4425 - val_loss: 0.3607 Epoch 16/100 90/90 [==============================] - 0s 783us/step - loss: 0.4550 - val_loss: 0.3523 Epoch 17/100 90/90 [==============================] - 0s 787us/step - loss: 0.4342 - val_loss: 0.3618 Epoch 18/100 90/90 [==============================] - 0s 781us/step - loss: 0.4237 - val_loss: 0.3437 Epoch 19/100 90/90 [==============================] - 0s 803us/step - loss: 0.4413 - val_loss: 0.3456 Epoch 20/100 90/90 [==============================] - 0s 778us/step - 
loss: 0.4363 - val_loss: 0.3478 Epoch 21/100 90/90 [==============================] - 0s 793us/step - loss: 0.4405 - val_loss: 0.3528 Epoch 22/100 90/90 [==============================] - 0s 786us/step - loss: 0.4439 - val_loss: 0.3506 Epoch 23/100 90/90 [==============================] - 0s 972us/step - loss: 0.4432 - val_loss: 0.3494 Epoch 24/100 90/90 [==============================] - 0s 908us/step - loss: 0.4357 - val_loss: 0.3599 Epoch 25/100 90/90 [==============================] - 0s 781us/step - loss: 0.4356 - val_loss: 0.3506 Epoch 26/100 90/90 [==============================] - 0s 747us/step - loss: 0.4458 - val_loss: 0.3494 Epoch 27/100 90/90 [==============================] - 0s 759us/step - loss: 0.4379 - val_loss: 0.4128 Epoch 28/100 90/90 [==============================] - 0s 757us/step - loss: 0.4288 - val_loss: 0.3478 Epoch 29/100 90/90 [==============================] - 0s 743us/step - loss: 0.4416 - val_loss: 0.3470 Epoch 30/100 90/90 [==============================] - 0s 794us/step - loss: 0.4368 - val_loss: 0.3598 Epoch 31/100 90/90 [==============================] - 0s 932us/step - loss: 0.4512 - val_loss: 0.4217 Epoch 32/100 90/90 [==============================] - 0s 894us/step - loss: 0.4362 - val_loss: 0.3531 Epoch 33/100 90/90 [==============================] - 0s 883us/step - loss: 0.4490 - val_loss: 0.3477 Epoch 34/100 90/90 [==============================] - 0s 840us/step - loss: 0.4353 - val_loss: 0.3513 Epoch 35/100 90/90 [==============================] - 0s 870us/step - loss: 0.4498 - val_loss: 0.4138 Epoch 36/100 90/90 [==============================] - 0s 1ms/step - loss: 0.4594 - val_loss: 0.3941 Epoch 37/100 90/90 [==============================] - 0s 899us/step - loss: 0.4377 - val_loss: 0.3489 Epoch 38/100 90/90 [==============================] - 0s 861us/step - loss: 0.4334 - val_loss: 0.3625 Epoch 39/100 90/90 [==============================] - 0s 819us/step - loss: 0.4409 - val_loss: 0.3587 Epoch 40/100 90/90 
[==============================] - 0s 830us/step - loss: 0.4360 - val_loss: 0.3898 Epoch 41/100 90/90 [==============================] - 0s 812us/step - loss: 0.4457 - val_loss: 0.3796 Epoch 42/100 90/90 [==============================] - 0s 778us/step - loss: 0.4468 - val_loss: 0.4658 Epoch 43/100 90/90 [==============================] - 0s 761us/step - loss: 0.4444 - val_loss: 0.3788 Epoch 44/100 90/90 [==============================] - 0s 757us/step - loss: 0.4166 - val_loss: 0.3657 Epoch 45/100 90/90 [==============================] - 0s 743us/step - loss: 0.4396 - val_loss: 0.3517 Epoch 46/100 90/90 [==============================] - 0s 782us/step - loss: 0.4410 - val_loss: 0.3504 Epoch 47/100 90/90 [==============================] - 0s 785us/step - loss: 0.4517 - val_loss: 0.3541 Epoch 48/100 90/90 [==============================] - 0s 785us/step - loss: 0.4435 - val_loss: 0.3610 Epoch 49/100 90/90 [==============================] - 0s 935us/step - loss: 0.4253 - val_loss: 0.3474 Epoch 50/100 90/90 [==============================] - 0s 841us/step - loss: 0.4194 - val_loss: 0.4088 Epoch 51/100 90/90 [==============================] - 0s 757us/step - loss: 0.4324 - val_loss: 0.3761 Epoch 52/100 90/90 [==============================] - 0s 753us/step - loss: 0.4283 - val_loss: 0.3935 Epoch 53/100 90/90 [==============================] - 0s 770us/step - loss: 0.4250 - val_loss: 0.3540 Epoch 54/100 90/90 [==============================] - 0s 760us/step - loss: 0.4296 - val_loss: 0.3511 Epoch 55/100 90/90 [==============================] - 0s 756us/step - loss: 0.4417 - val_loss: 0.3547 Epoch 56/100 90/90 [==============================] - 0s 792us/step - loss: 0.4363 - val_loss: 0.3642 Epoch 57/100 90/90 [==============================] - 0s 773us/step - loss: 0.4329 - val_loss: 0.3601 Epoch 58/100 90/90 [==============================] - 0s 745us/step - loss: 0.4479 - val_loss: 0.3506 Epoch 59/100 90/90 [==============================] - 0s 741us/step - loss: 
0.4187 - val_loss: 0.3653 Epoch 60/100 90/90 [==============================] - 0s 754us/step - loss: 0.4399 - val_loss: 0.3698 Epoch 61/100 90/90 [==============================] - 0s 778us/step - loss: 0.4552 - val_loss: 0.3635 Epoch 62/100 90/90 [==============================] - 0s 749us/step - loss: 0.4446 - val_loss: 0.3563 Epoch 63/100 90/90 [==============================] - 0s 741us/step - loss: 0.4251 - val_loss: 0.3590 Epoch 64/100 90/90 [==============================] - 0s 762us/step - loss: 0.4377 - val_loss: 0.3533 Epoch 65/100 90/90 [==============================] - 0s 760us/step - loss: 0.4157 - val_loss: 0.3661 Epoch 66/100 90/90 [==============================] - 0s 783us/step - loss: 0.4287 - val_loss: 0.3546 Epoch 67/100 90/90 [==============================] - 0s 734us/step - loss: 0.4306 - val_loss: 0.3580 Epoch 68/100 90/90 [==============================] - 0s 756us/step - loss: 0.4529 - val_loss: 0.3890 Epoch 69/100 90/90 [==============================] - 0s 745us/step - loss: 0.4413 - val_loss: 0.3575 Epoch 70/100 90/90 [==============================] - 0s 753us/step - loss: 0.4504 - val_loss: 0.3510 Epoch 71/100 90/90 [==============================] - 0s 764us/step - loss: 0.4373 - val_loss: 0.3511 Epoch 72/100 90/90 [==============================] - 0s 766us/step - loss: 0.4331 - val_loss: 0.3516 Epoch 73/100 90/90 [==============================] - 0s 798us/step - loss: 0.4250 - val_loss: 0.3532 Epoch 74/100 90/90 [==============================] - 0s 750us/step - loss: 0.4440 - val_loss: 0.3578 Epoch 75/100 90/90 [==============================] - 0s 757us/step - loss: 0.4354 - val_loss: 0.3691 Epoch 76/100 90/90 [==============================] - 0s 745us/step - loss: 0.4319 - val_loss: 0.3500 Epoch 77/100 90/90 [==============================] - 0s 950us/step - loss: 0.4487 - val_loss: 0.3532 Epoch 78/100 90/90 [==============================] - 0s 1ms/step - loss: 0.4182 - val_loss: 0.3752 Epoch 79/100 90/90 
[==============================] - 0s 759us/step - loss: 0.4312 - val_loss: 0.3658 Epoch 80/100 90/90 [==============================] - 0s 757us/step - loss: 0.4426 - val_loss: 0.3528 Epoch 81/100 90/90 [==============================] - 0s 754us/step - loss: 0.4315 - val_loss: 0.3549 Epoch 82/100 90/90 [==============================] - 0s 744us/step - loss: 0.4300 - val_loss: 0.3485 Epoch 83/100 90/90 [==============================] - 0s 735us/step - loss: 0.4490 - val_loss: 0.3593 Epoch 84/100 90/90 [==============================] - 0s 754us/step - loss: 0.4683 - val_loss: 0.3758 Epoch 85/100 90/90 [==============================] - 0s 748us/step - loss: 0.4313 - val_loss: 0.3762 Epoch 86/100 90/90 [==============================] - 0s 748us/step - loss: 0.4333 - val_loss: 0.3747 Epoch 87/100 90/90 [==============================] - 0s 769us/step - loss: 0.4306 - val_loss: 0.3763 Epoch 88/100 90/90 [==============================] - 0s 855us/step - loss: 0.4248 - val_loss: 0.3598 Epoch 89/100 90/90 [==============================] - 0s 793us/step - loss: 0.4437 - val_loss: 0.3483 Epoch 90/100 90/90 [==============================] - 0s 739us/step - loss: 0.4327 - val_loss: 0.3536 Epoch 91/100 90/90 [==============================] - 0s 740us/step - loss: 0.4483 - val_loss: 0.3870 Epoch 92/100 90/90 [==============================] - 0s 741us/step - loss: 0.4420 - val_loss: 0.3514 Epoch 93/100 90/90 [==============================] - 0s 733us/step - loss: 0.4501 - val_loss: 0.3534 Epoch 94/100 90/90 [==============================] - 0s 738us/step - loss: 0.4483 - val_loss: 0.3924 Epoch 95/100 90/90 [==============================] - 0s 735us/step - loss: 0.4373 - val_loss: 0.3820 Epoch 96/100 90/90 [==============================] - 0s 737us/step - loss: 0.4497 - val_loss: 0.3563 Epoch 97/100 90/90 [==============================] - 0s 747us/step - loss: 0.4176 - val_loss: 0.3775 Epoch 98/100 90/90 [==============================] - 0s 743us/step - loss: 
0.4471 - val_loss: 0.3531 Epoch 99/100 90/90 [==============================] - 0s 741us/step - loss: 0.4354 - val_loss: 0.3564 Epoch 100/100 90/90 [==============================] - 0s 732us/step - loss: 0.4554 - val_loss: 0.4081
<keras.callbacks.History at 0x1eba89ad7f0>
Pickle the model for future use.
# save the model
model_reg.save("model_reg.h5") #using h5 extension
print("model saved!!!")
model saved!!!
# load the model
model_rr = load_model('model_reg.h5')
Attempting to pickle the model raises TypeError: cannot pickle 'weakref' object. Keras models hold weak references that pickle cannot serialize, so save() is used to persist the model and load_model() to restore it.
# Save the Model to a file in the current working directory
#Pkl_Filename = "Pickle_RR_Model.pkl"
#with open(Pkl_Filename, 'wb') as file:
# pickle.dump(model_reg, file)
# Load the Model back from file
#with open(Pkl_Filename, 'rb') as file:
# Pickled_RR_Model = pickle.load(file)
#Pickled_RR_Model
y_pred = model_rr.predict(X_test)
print(y_pred[0])
print(y_pred[1])
print(y_pred[2])
print(y_pred[3])
print(y_pred[4])
[5.81087] [5.8921013] [6.152836] [5.508875] [6.0615234]
print(y_test.head())
1526 6.0 674 6.0 1508 6.0 58 5.0 1351 6.0 Name: Signal_Strength, dtype: float64
The first 5 elements of y_pred and y_test are close.
score_r = r2_score(y_test,y_pred)
print(score_r)
0.2768244181621704
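For context, r2_score computes 1 - SS_res/SS_tot, so a score of about 0.28 means the single-layer model explains only about 28% of the target variance beyond simply predicting the mean. A hand computation on toy arrays (values chosen for illustration):

```python
import numpy as np
from sklearn.metrics import r2_score

y_true = np.array([5.0, 6.0, 6.0, 5.0, 7.0])
y_pred = np.array([5.8, 5.9, 6.2, 5.5, 6.1])

ss_res = np.sum((y_true - y_pred) ** 2)          # residual sum of squares
ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total sum of squares
manual = 1 - ss_res / ss_tot

print(round(manual, 4))                          # 0.375
print(np.isclose(manual, r2_score(y_true, y_pred)))  # True
```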
#summary of regression model
model_rr.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
batch_normalization (BatchN (None, 11) 44
ormalization)
dense (Dense) (None, 1) 12
=================================================================
Total params: 56
Trainable params: 34
Non-trainable params: 22
_________________________________________________________________
The need is to build a classifier which can use these parameters to determine the signal strength or quality. Steps 1 and 2 are the same as for the regressor above.
Design, train, tune and test a neural network classifier.
# counting the number of classes in output
mydata['Signal_Strength'].value_counts()
5.0    681
6.0    638
7.0    199
4.0     53
7.5     18
3.5     10
Name: Signal_Strength, dtype: int64

The fractional classes 3.5 and 7.5 appear because the outlier treatment clipped Signal_Strength at its whiskers.
X.shape
(1599, 11)
y.shape
(1599,)
# one-hot encode the target; to_categorical casts labels to int, so the
# clipped values 3.5 and 7.5 fall into classes 3 and 7
yc = to_categorical(y, num_classes=8)
# splitting data to create the test set for the categorical target
Xcv_train, Xc_test, ycv_train, yc_test = train_test_split(X, yc, test_size=.30, random_state=seed)
print("Shape of y_train:", ycv_train.shape)
print("One value of y_train:", ycv_train[0])
Shape of y_train: (1119, 8) One value of y_train: [0. 0. 0. 0. 0. 0. 0. 1.]
# splitting data into train and validation sets for the categorical target
Xc_train, Xc_val, yc_train, yc_val = train_test_split(Xcv_train, ycv_train, test_size=.20, random_state=seed)
print("Shape of y_train:", yc_train.shape)
print("One value of y_train:", yc_train[0])
Shape of y_train: (895, 8) One value of y_train: [0. 0. 0. 0. 0. 1. 0. 0.]
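The shape (895, 8) reflects the one-hot encoding above. A NumPy-only sketch of what to_categorical does, including the int cast that sends the clipped 7.5 into class 7 (the helper name `one_hot` is illustrative):

```python
import numpy as np

def one_hot(labels, num_classes):
    """NumPy equivalent of keras to_categorical: cast to int, then one-hot."""
    idx = np.asarray(labels).astype(int)  # 7.5 -> 7, 3.5 -> 3
    out = np.zeros((len(idx), num_classes))
    out[np.arange(len(idx)), idx] = 1.0
    return out

yc = one_hot([5, 6, 7.5, 3.5], num_classes=8)
print(yc[2])  # [0. 0. 0. 0. 0. 0. 0. 1.]  -- 7.5 lands in class 7
```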
model_class = Sequential()
model_class.add(Dense(11, activation='relu', input_shape=(11,)))
model_class.add(Dense(8, activation='relu'))
model_class.add(Dense(8, activation='softmax'))
# Compile the model
model_class.compile(loss="categorical_crossentropy", metrics=["accuracy"], optimizer="sgd")
# Fit the model
model_class.fit(x=Xc_train, y=yc_train, batch_size=20, epochs=100, validation_data=(Xc_val, yc_val))
Epoch 1/100 45/45 [==============================] - 0s 3ms/step - loss: 2.2498 - accuracy: 0.3330 - val_loss: 1.9743 - val_accuracy: 0.5134 Epoch 2/100 45/45 [==============================] - 0s 982us/step - loss: 1.9310 - accuracy: 0.4503 - val_loss: 1.8608 - val_accuracy: 0.5134 Epoch 3/100 45/45 [==============================] - 0s 935us/step - loss: 1.7944 - accuracy: 0.4659 - val_loss: 1.7492 - val_accuracy: 0.4955 Epoch 4/100 45/45 [==============================] - 0s 996us/step - loss: 1.6702 - accuracy: 0.4737 - val_loss: 1.6036 - val_accuracy: 0.5402 Epoch 5/100 45/45 [==============================] - 0s 935us/step - loss: 1.4860 - accuracy: 0.4436 - val_loss: 1.3546 - val_accuracy: 0.3527 Epoch 6/100 45/45 [==============================] - 0s 960us/step - loss: 1.2666 - accuracy: 0.3866 - val_loss: 1.0943 - val_accuracy: 0.3795 Epoch 7/100 45/45 [==============================] - 0s 997us/step - loss: 1.1846 - accuracy: 0.4302 - val_loss: 1.0526 - val_accuracy: 0.5045 Epoch 8/100 45/45 [==============================] - 0s 974us/step - loss: 1.1741 - accuracy: 0.4268 - val_loss: 1.0711 - val_accuracy: 0.4509 Epoch 9/100 45/45 [==============================] - 0s 987us/step - loss: 1.1746 - accuracy: 0.4302 - val_loss: 1.0576 - val_accuracy: 0.4866 Epoch 10/100 45/45 [==============================] - 0s 993us/step - loss: 1.1671 - accuracy: 0.4134 - val_loss: 1.0576 - val_accuracy: 0.5134 Epoch 11/100 45/45 [==============================] - 0s 991us/step - loss: 1.1505 - accuracy: 0.4402 - val_loss: 0.9921 - val_accuracy: 0.5223 Epoch 12/100 45/45 [==============================] - 0s 988us/step - loss: 1.1250 - accuracy: 0.4838 - val_loss: 0.9899 - val_accuracy: 0.5714 Epoch 13/100 45/45 [==============================] - 0s 1ms/step - loss: 1.1249 - accuracy: 0.4860 - val_loss: 1.0204 - val_accuracy: 0.5312 Epoch 14/100 45/45 [==============================] - 0s 1ms/step - loss: 1.1288 - accuracy: 0.4704 - val_loss: 1.0572 - val_accuracy: 
0.4464
[Epochs 15–99 omitted for brevity: training loss drifted from 1.1256 down to about 1.07 and training accuracy from ~0.48 to ~0.53; validation accuracy fluctuated between 0.38 and 0.58 with no sustained improvement.]
Epoch 100/100 45/45 [==============================] - 0s 922us/step - loss: 1.0775 - accuracy: 0.5251 - val_loss: 0.9612 - val_accuracy: 0.5402
<keras.callbacks.History at 0x1ebafc151f0>
Save the trained classification model to disk for future use (Keras HDF5 format rather than pickle).
# save the model
model_class.save("model_class.h5") #using h5 extension
print("model saved!!!")
model saved!!!
# load the model
model_cl = load_model('model_class.h5')
# calculate score of training data
score = model_cl.evaluate(Xc_train, yc_train, verbose=0)
print(score)
[1.064502477645874, 0.5094972252845764]
# score of test data
score_t = model_cl.evaluate(Xc_test, yc_test, verbose=0)
print( score_t)
[1.0749679803848267, 0.518750011920929]
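The raw lists returned by `model.evaluate()` are easy to misread. A small helper (hypothetical, not in the original notebook) labels the loss/accuracy pair; the numbers below are the scores printed above:

```python
# Hypothetical helper: label the raw [loss, accuracy] pair
# returned by Keras model.evaluate() for readability.
def report_score(name, score):
    loss, acc = score
    return f"{name}: loss={loss:.4f}, accuracy={acc:.1%}"

print(report_score("train", [1.0645, 0.5095]))
print(report_score("test", [1.0750, 0.5188]))
```

Train and test accuracy are within a point of each other (~51–52%), so the model underfits rather than overfits.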
#summary of classification model
model_cl.summary()
Model: "sequential_1"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense_1 (Dense) (None, 11) 132
dense_2 (Dense) (None, 8) 96
dense_3 (Dense) (None, 8) 72
=================================================================
Total params: 300
Trainable params: 300
Non-trainable params: 0
_________________________________________________________________
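The parameter counts in the summary follow from simple Dense-layer arithmetic: params = inputs × units + units (one bias per unit). A quick check, assuming the 11-feature input shown above:

```python
def dense_params(n_in, n_units):
    # weight matrix (n_in * n_units) plus one bias per unit
    return n_in * n_units + n_units

layers = [(11, 11), (11, 8), (8, 8)]  # (inputs, units) for dense_1..dense_3
counts = [dense_params(i, u) for i, u in layers]
print(counts, sum(counts))  # [132, 96, 72] 300, matching the summary
```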
# Alias the dataframe as DB for the cells below
DB = mydata
import sklearn
from sklearn.model_selection import train_test_split
# Independent variables
X=DB.drop('Signal_Strength',axis=1)
# Target variable
Y=DB['Signal_Strength']
X_Train,X_Test,Y_Train,Y_Test=train_test_split(X, Y, train_size=0.7, random_state=12)
from sklearn.preprocessing import StandardScaler
# Fit the scaler on the training data only
scaler = StandardScaler()
X_Train_S = scaler.fit_transform(X_Train)
# Transform the test data with the training statistics to avoid leakage
X_Test_S = scaler.transform(X_Test)
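`StandardScaler` subtracts the training mean and divides by the training standard deviation; the test set must be transformed with those same training statistics, never its own. A minimal NumPy sketch of that rule on toy data:

```python
import numpy as np

train = np.array([[1.0], [2.0], [3.0], [4.0]])
test = np.array([[5.0]])

mu, sigma = train.mean(axis=0), train.std(axis=0)
train_s = (train - mu) / sigma  # mean 0, std 1 by construction
test_s = (test - mu) / sigma    # uses TRAIN statistics, not its own

print(train_s.ravel(), test_s.ravel())
```

Fitting a second scaler on the test set (as in a naive `fit_transform` on both) would leak test-set statistics into preprocessing and make the two feature spaces inconsistent.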
# Converting y data into categorical (one-hot encoding)
from tensorflow.keras.utils import to_categorical  # keras.utils.np_utils path is deprecated
Y_Train = to_categorical(Y_Train)
Y_Test = to_categorical(Y_Test)
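`to_categorical` turns an integer label k into a vector with a 1 at index k; with labels running up to 8, it emits 9 columns, which is where the 9-wide target matrices printed below come from. A NumPy equivalent, for illustration only:

```python
import numpy as np

def one_hot(labels, num_classes):
    # NumPy sketch of keras' to_categorical for integer labels
    out = np.zeros((len(labels), num_classes))
    out[np.arange(len(labels)), labels] = 1.0
    return out

print(one_hot([5, 6, 8], 9))  # three rows, nine columns, one 1 per row
```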
# Confirming Matrix size
print(X_Train_S.shape)
print(X_Test_S.shape)
print(Y_Train.shape)
print(Y_Test.shape)
(1119, 11)
(480, 11)
(1119, 9)
(480, 9)
from keras.models import Sequential # Forward prop
from keras.layers import Dense, Activation, LeakyReLU
from keras import optimizers
NN_model_Classifier = Sequential()
# The Input Layer :
NN_model_Classifier.add(Dense(128, kernel_initializer='normal',input_dim = X_Train.shape[1], activation='relu'))
# The Hidden Layers :
NN_model_Classifier.add(Dense(64, kernel_initializer='normal', activation='relu'))  # alternatives: 'sigmoid', 'tanh'
NN_model_Classifier.add(Dense(32, kernel_initializer='normal'))
NN_model_Classifier.add(LeakyReLU(alpha=0.1))
NN_model_Classifier.add(Dense(16, kernel_initializer='normal'))
NN_model_Classifier.add(LeakyReLU(alpha=0.1))
# The Output Layer :
NN_model_Classifier.add(Dense(9, kernel_initializer='normal', activation='softmax'))  # softmax yields class probabilities
# Compile the network :
# NB: 'categorical_crossentropy' is the standard loss for one-hot softmax outputs;
# MAE is kept here to match the training run recorded below
NN_model_Classifier.compile(loss='mean_absolute_error', optimizer='adam', metrics=['accuracy'])
NN_model_Classifier.summary()
Model: "sequential_2"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense_4 (Dense) (None, 128) 1536
dense_5 (Dense) (None, 64) 8256
dense_6 (Dense) (None, 32) 2080
leaky_re_lu (LeakyReLU) (None, 32) 0
dense_7 (Dense) (None, 16) 528
leaky_re_lu_1 (LeakyReLU) (None, 16) 0
dense_8 (Dense) (None, 9) 153
=================================================================
Total params: 12,553
Trainable params: 12,553
Non-trainable params: 0
_________________________________________________________________
EPOCH=400
Network_Classifier=NN_model_Classifier.fit(X_Train_S, Y_Train, validation_data=(X_Test_S,Y_Test), epochs=EPOCH, batch_size=200)
Epoch 1/400 6/6 [==============================] - 0s 18ms/step - loss: 0.1974 - accuracy: 0.3181 - val_loss: 0.1973 - val_accuracy: 0.4042
[Epochs 2–170 omitted for brevity: training loss fell from 0.1972 to 0.0795 and training accuracy climbed from 0.4352 to 0.6425 by roughly epoch 115, after which both plateaued; validation accuracy settled near 0.62.]
Epoch 171/400 6/6
[==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 172/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 173/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 174/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 175/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 176/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 177/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 178/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 179/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 180/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 181/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6229 Epoch 182/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 183/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 184/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 185/400 6/6 
[==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 186/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 187/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 188/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 189/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 190/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 191/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 192/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 193/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 194/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 195/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6229 Epoch 196/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6229 Epoch 197/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 198/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 199/400 6/6 
[==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0842 - val_accuracy: 0.6229 Epoch 200/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 201/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 202/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 203/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6229 Epoch 204/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6229 Epoch 205/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 206/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 207/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 208/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 209/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 210/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 211/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 212/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 213/400 6/6 
[==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 214/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 215/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 216/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 217/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 218/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 219/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 220/400 6/6 [==============================] - 0s 6ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 221/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 222/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 223/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 224/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 225/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 226/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 227/400 6/6 
[==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 228/400 6/6 [==============================] - 0s 6ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 229/400 6/6 [==============================] - 0s 6ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 230/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 231/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 232/400 6/6 [==============================] - 0s 6ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 233/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 234/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 235/400 6/6 [==============================] - 0s 6ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 236/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 237/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 238/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 239/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 240/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 241/400 6/6 
[==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 242/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 243/400 6/6 [==============================] - 0s 6ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 244/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 245/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 246/400 6/6 [==============================] - 0s 6ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 247/400 6/6 [==============================] - 0s 6ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 248/400 6/6 [==============================] - 0s 6ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 249/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 250/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 251/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 252/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 253/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 254/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 255/400 6/6 
[==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 256/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 257/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 258/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 259/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 260/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0841 - val_accuracy: 0.6208 Epoch 261/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 262/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 263/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 264/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 265/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 266/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0839 - val_accuracy: 0.6229 Epoch 267/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 268/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0836 - val_accuracy: 0.6229 Epoch 269/400 6/6 
[==============================] - 0s 4ms/step - loss: 0.0796 - accuracy: 0.6425 - val_loss: 0.0840 - val_accuracy: 0.6208 Epoch 270/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0796 - accuracy: 0.6425 - val_loss: 0.0851 - val_accuracy: 0.6167 Epoch 271/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0799 - accuracy: 0.6408 - val_loss: 0.0839 - val_accuracy: 0.6229 Epoch 272/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0805 - accuracy: 0.6381 - val_loss: 0.0855 - val_accuracy: 0.6146 Epoch 273/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0811 - accuracy: 0.6354 - val_loss: 0.0864 - val_accuracy: 0.6083 Epoch 274/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0801 - accuracy: 0.6399 - val_loss: 0.0844 - val_accuracy: 0.6187 Epoch 275/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0804 - accuracy: 0.6390 - val_loss: 0.0847 - val_accuracy: 0.6187 Epoch 276/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0798 - accuracy: 0.6416 - val_loss: 0.0861 - val_accuracy: 0.6125 Epoch 277/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0798 - accuracy: 0.6408 - val_loss: 0.0855 - val_accuracy: 0.6146 Epoch 278/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0798 - accuracy: 0.6416 - val_loss: 0.0852 - val_accuracy: 0.6167 Epoch 279/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0798 - accuracy: 0.6416 - val_loss: 0.0854 - val_accuracy: 0.6146 Epoch 280/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0797 - accuracy: 0.6416 - val_loss: 0.0849 - val_accuracy: 0.6167 Epoch 281/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0796 - accuracy: 0.6416 - val_loss: 0.0845 - val_accuracy: 0.6187 Epoch 282/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0836 - val_accuracy: 0.6250 Epoch 283/400 6/6 
[==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0834 - val_accuracy: 0.6250 Epoch 284/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0836 - val_accuracy: 0.6250 Epoch 285/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 286/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0839 - val_accuracy: 0.6250 Epoch 287/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0795 - accuracy: 0.6425 - val_loss: 0.0839 - val_accuracy: 0.6250 Epoch 288/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 289/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 290/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 291/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 292/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 293/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 294/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 295/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 296/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 297/400 6/6 
[==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 298/400 6/6 [==============================] - 0s 6ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0837 - val_accuracy: 0.6250 Epoch 299/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0837 - val_accuracy: 0.6250 Epoch 300/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 301/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 302/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 303/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 304/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 305/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 306/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 307/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 308/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 309/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 310/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 311/400 6/6 
[==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 312/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 313/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 314/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 315/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 316/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 317/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 318/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 319/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6250 Epoch 320/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 321/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 322/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 323/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 324/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 325/400 6/6 
[==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 326/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 327/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 328/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 329/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 330/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 331/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 332/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 333/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 334/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 335/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 336/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 337/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 338/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 339/400 6/6 
[==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 340/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 341/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 342/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 343/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229 Epoch 344/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0839 - val_accuracy: 0.6229 Epoch 345/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0839 - val_accuracy: 0.6229 Epoch 346/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0839 - val_accuracy: 0.6229 Epoch 347/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0839 - val_accuracy: 0.6229 Epoch 348/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0839 - val_accuracy: 0.6229 Epoch 349/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0839 - val_accuracy: 0.6229 Epoch 350/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0839 - val_accuracy: 0.6229 Epoch 351/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0839 - val_accuracy: 0.6229 Epoch 352/400 6/6 [==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0839 - val_accuracy: 0.6229 Epoch 353/400 6/6 
[==============================] - 0s 5ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0839 - val_accuracy: 0.6229
Epoch 354/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0838 - val_accuracy: 0.6229
... (epochs 355-399 omitted; loss and accuracy are flat at these values) ...
Epoch 400/400 6/6 [==============================] - 0s 4ms/step - loss: 0.0794 - accuracy: 0.6425 - val_loss: 0.0839 - val_accuracy: 0.6229
loss_train = Network_Classifier.history['loss']
loss_val = Network_Classifier.history['val_loss']
epochs = range(1,EPOCH+1)
plt.plot(epochs, loss_train, 'g', label='Training loss')
plt.plot(epochs, loss_val, 'b', label='Validation loss')
plt.title('Training and Validation loss')
plt.xlabel('Epochs')
plt.ylabel('Loss')
plt.legend()
plt.show()
Acc_train = Network_Classifier.history['accuracy']
Acc_val = Network_Classifier.history['val_accuracy']
epochs = range(1,EPOCH+1)
plt.plot(epochs, Acc_train, 'g', label='Training accuracy')
plt.plot(epochs, Acc_val, 'b', label='Validation accuracy')
plt.title('Training and Validation accuracy')
plt.xlabel('Epochs')
plt.ylabel('accuracy')
plt.legend()
plt.show()
from tensorflow.keras.models import model_from_json
# Serialize model architecture to JSON
Classifier_model_json = NN_model_Classifier.to_json()
with open("Classifier_model.json", "w") as json_file:
    json_file.write(Classifier_model_json)
# Serialize weights to HDF5
NN_model_Classifier.save_weights("Classifier_model.h5")
print("Saved model to disk")
# load json and create model
json_file = open('Classifier_model.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
loaded_model = model_from_json(loaded_model_json)
# load weights into new model
loaded_model.load_weights("Classifier_model.h5")
print("Loaded model from disk")
# Evaluate
loaded_model.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
score = loaded_model.evaluate(X_Test_S,Y_Test, verbose=0)
print("%s: %.2f%%" % (loaded_model.metrics_names[1], score[1]*100))
Saved model to disk
Loaded model from disk
accuracy: 62.29%
• DOMAIN: Autonomous Vehicles
• CONTEXT: Recognising multi-digit numbers in photographs captured at street level is an important component of modern map making. A classic example of a corpus of such street-level photographs is Google’s Street View imagery, composed of hundreds of millions of geo-located 360-degree panoramic images. The ability to automatically transcribe an address number from a geo-located patch of pixels and associate it with a known street address helps pinpoint, with a high degree of accuracy, the location of the building it represents. More broadly, recognising numbers in photographs is a problem of interest to the optical character recognition community. While OCR on constrained domains such as document processing is well studied, arbitrary multi-character text recognition in photographs remains highly challenging. This difficulty arises from the wide variability in the visual appearance of text in the wild: a large range of fonts, colours, styles, orientations, and character arrangements. The recognition problem is further complicated by environmental factors such as lighting, shadows, specularity, and occlusions, as well as by image-acquisition factors such as resolution, motion, and focus blur. In this project, we will use a dataset of images centred on a single digit (many of the images do contain some distractors at the sides). Although this sample of the data is simpler, it is more complex than MNIST because of the distractors.
# Initialize the random number generator
import random
random.seed(1)
# Import necessary libraries
import h5py
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.metrics import confusion_matrix, accuracy_score, classification_report
# import tensorflow
import tensorflow as tf
# Read in image datafile
f = h5py.File('Autonomous_Vehicles_SVHN_single_grey1.h5', "r")
# List all groups
print(list(f.keys()))
['X_test', 'X_train', 'X_val', 'y_test', 'y_train', 'y_val']
# Read training, validation and test data in
X_train, X_val, X_test, y_train, y_val, y_test = np.array(f['X_train']), np.array(f['X_val']), np.array(f['X_test']), np.array(f['y_train']), np.array(f['y_val']), np.array(f['y_test'])
# Check out shapes of data read in
X_train.shape, X_val.shape, X_test.shape, y_train.shape, y_val.shape, y_test.shape
((42000, 32, 32), (60000, 32, 32), (18000, 32, 32), (42000,), (60000,), (18000,))
Data Visualization and Preprocessing
# Display first 10 images from training dataset
plt.figure(figsize=(16,10))
for i in range(10):
    plt.subplot(5,5,i+1)
    plt.grid(False)
    plt.xticks([])
    plt.yticks([])
    plt.imshow(X_train[i], cmap=plt.cm.binary)
    plt.xlabel(y_train[i])
plt.show()
# Display 10 random images from validation dataset
import random
plt.figure(figsize=(16,10))
for i in range(10):
    j = random.randint(0, len(X_val)-1)
    plt.subplot(5,5,i+1)
    plt.grid(False)
    plt.xticks([])
    plt.yticks([])
    plt.imshow(X_val[j], cmap=plt.cm.binary)
    plt.xlabel(y_val[j])
plt.show()
# Display first 10 images from test dataset
plt.figure(figsize=(16,10))
for i in range(10):
    plt.subplot(5,5,i+1)
    plt.grid(False)
    plt.xticks([])
    plt.yticks([])
    plt.imshow(X_test[i], cmap=plt.cm.binary)
    plt.xlabel(y_test[i])
plt.show()
Reshape all the images to the appropriate shape and update the data in the same variables.
print('X_train: (min, max):', (X_train.min(), X_train.max()))
print('X_val: (min, max):', (X_val.min(), X_val.max()))
print('X_test: (min, max):', (X_test.min(), X_test.max()))
X_train: (min, max): (0.0, 254.9745) X_val: (min, max): (0.0, 254.9745) X_test: (min, max): (0.0, 254.9745)
• Pixel values (image features) for all the images range between 0 and 254.9745.
• Pixel values of a digital image should be integer-valued and lie between 0 and 255.
• It can therefore be inferred that the feature data provided for this classification project is not raw data but has already been processed.
• We scale the features to the range 0-1 (min-max scaling) - this helps the network train more efficiently (e.g., when learning weights with the SGD algorithm).
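Since the observed minimum is 0, dividing by the maximum (as done in the next cell) coincides with full min-max scaling. A minimal pure-Python sketch of the general formula, with a hypothetical `minmax_scale` helper:

```python
def minmax_scale(x, lo, hi):
    """Scale values from the observed range [lo, hi] to [0, 1]."""
    return [(v - lo) / (hi - lo) for v in x]

pixels = [0.0, 127.48725, 254.9745]           # toy pixel values
scaled = minmax_scale(pixels, 0.0, 254.9745)
print(scaled)                                  # [0.0, 0.5, 1.0]
```

Dividing by `hi` alone, as the notebook does, is only equivalent to this when `lo` is 0.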
Normalise the images, i.e. scale the pixel values to [0, 1].
# Scale feature set
X_train /= 254.9745
X_val /= 254.9745
X_test /= 254.9745
# Look at the unique values target variable takes
print('y_train unique values:', set(y_train))
print('y_val unique values:', set(y_val))
print('y_test unique values:', set(y_test))
y_train unique values: {0, 1, 2, 3, 4, 5, 6, 7, 8, 9}
y_val unique values: {0, 1, 2, 3, 4, 5, 6, 7, 8, 9}
y_test unique values: {0, 1, 2, 3, 4, 5, 6, 7, 8, 9}
• Response variable labels range between 0 and 9, each representing the prominent digit in an image.
• We one-hot-encode the labels so that each digit becomes its own class indicator.
• We can then build a neural-network classifier with a softmax activation function in the output layer, so the network outputs a probability for each class per record to predict class membership.
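The one-hot encoding described above can be sketched without Keras; this toy `one_hot` helper (an illustrative stand-in for `to_categorical`, not the notebook's code) shows what the transformation produces:

```python
def one_hot(labels, num_classes):
    """Return one one-hot vector per integer label."""
    return [[1.0 if c == y else 0.0 for c in range(num_classes)]
            for y in labels]

vecs = one_hot([3, 0], 10)
# first vector has a 1.0 at index 3 and zeros elsewhere
print(vecs[0])
```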
# Let's also take a look at frequency of occurrence of each of the digits in training, validation and test datasets
fig, ax = plt.subplots(1, 3, figsize=(16, 5))
_ = sns.countplot(y_train, ax=ax[0]).set_title("Histogram of y_train")
_ = sns.countplot(y_val, ax=ax[1]).set_title("Histogram of y_val")
_ = sns.countplot(y_test, ax=ax[2]).set_title("Histogram of y_test")
Transform the labels into a format acceptable to the neural network.
• Each digit category is nearly equally represented across the training, validation and test datasets.
• We can infer that the data provided is well balanced.
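The balance claim above can also be checked numerically rather than visually; a stdlib sketch with a hypothetical `is_balanced` helper and an arbitrary 25% tolerance:

```python
from collections import Counter

def is_balanced(labels, tol=0.25):
    """True if every class count is within tol of the mean class count."""
    counts = Counter(labels)
    mean = len(labels) / len(counts)
    return all(abs(c - mean) / mean <= tol for c in counts.values())

# toy labels: ten classes, 100 samples each
toy = [d for d in range(10) for _ in range(100)]
print(is_balanced(toy))   # True
```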
# One-Hot-Encode target variable
y_train_encoded = tf.keras.utils.to_categorical(y_train, num_classes=10)
y_val_encoded = tf.keras.utils.to_categorical(y_val, num_classes=10)
y_test_encoded = tf.keras.utils.to_categorical(y_test, num_classes=10)
Print the total number of classes in the dataset.
# Number of classes = width of the one-hot-encoded target
# (the raw y_train is 1-D, so y_train.shape[1] would raise an IndexError)
print("Number of classes:", y_train_encoded.shape[1])
Number of classes: 10
# Initialize Neural Network (Sequential) model
model = tf.keras.Sequential()
# Reshape the input of 32 x 32 image into 1d array with 1024 features
model.add(tf.keras.layers.Reshape(target_shape=(1024,), input_shape=(32,32,)))
# Add Layer 1 with 256 neurons and Leaky-ReLU activation function
model.add(tf.keras.layers.Dense(units=256,
                                kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
                                activation=tf.keras.layers.LeakyReLU(alpha=0.3),
                                name='HL1'))
# Add Layer 2 with 128 neurons and Leaky-ReLU activation function
model.add(tf.keras.layers.Dense(units=128,
                                kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
                                activation=tf.keras.layers.LeakyReLU(alpha=0.3),
                                name='HL2'))
# Add Layer 3 with 64 neurons and Leaky-ReLU activation function
model.add(tf.keras.layers.Dense(units=64,
                                kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
                                activation=tf.keras.layers.LeakyReLU(alpha=0.3),
                                name='HL3'))
# Output Layer with 10 neurons and softmax activation function
model.add(tf.keras.layers.Dense(units=10, activation='softmax', name='Output'))
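As a sanity check on the architecture above, each Dense layer holds in_features × units weights plus units biases; quick arithmetic mirroring the layer sizes:

```python
# (input_size, units) for HL1, HL2, HL3 and the softmax output layer
layers = [(1024, 256), (256, 128), (128, 64), (64, 10)]
params = [n_in * n_out + n_out for n_in, n_out in layers]
print(params)        # [262400, 32896, 8256, 650]
print(sum(params))   # 304202
```

These totals should match what `model.summary()` reports for the network.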
Train the classifier using the previously designed architecture (use the best suitable parameters).
# compile model
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.003),
              loss='categorical_crossentropy', metrics=['accuracy'])
# Set early stopping criteria (i.e., no improvement in validation loss in 10 successive epochs)
callback = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=10,
                                            restore_best_weights=True, mode='min')
# Train model
hist = model.fit(X_train, y_train_encoded, batch_size=64, epochs=150, verbose=1,
                 validation_data=(X_val, y_val_encoded), callbacks=[callback])
Epoch 1/150 657/657 [==============================] - 2s 3ms/step - loss: 2.2870 - accuracy: 0.1370 - val_loss: 2.0103 - val_accuracy: 0.2511
Epoch 2/150 657/657 [==============================] - 2s 3ms/step - loss: 1.5675 - accuracy: 0.4612 - val_loss: 1.2693 - val_accuracy: 0.5830
... (epochs 3-60 omitted; losses fall and accuracies rise steadily) ...
Epoch 61/150 657/657 [==============================] - 2s 3ms/step - loss: 0.5401 - accuracy: 0.8350 - val_loss: 0.5088 - val_accuracy: 0.8500
... (epochs 62-70 omitted; val_loss does not improve further) ...
Epoch 71/150 657/657 [==============================] - 2s 3ms/step - loss: 0.5165 - accuracy: 0.8426 - val_loss: 0.5925 - val_accuracy: 0.8256
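The EarlyStopping callback used above halts training once `val_loss` has failed to improve for `patience` consecutive epochs (here 10), and `restore_best_weights=True` rolls the weights back to the best epoch. Its core bookkeeping can be sketched in plain Python (simplified; ignores `min_delta`):

```python
def early_stop_epoch(val_losses, patience):
    """Return the 1-based epoch at which training would stop, or None."""
    best, best_epoch, wait = float('inf'), None, 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best, best_epoch, wait = loss, epoch, 0   # new best: reset counter
        else:
            wait += 1
            if wait >= patience:
                return epoch   # stop here; weights restored from best_epoch
    return None                # ran all epochs without triggering

print(early_stop_epoch([1.0, 0.8, 0.9, 0.85, 0.9], patience=3))  # 5
```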
We can plot the training and validation loss versus the number of epochs, and the training and validation accuracy versus the number of epochs, using Matplotlib.
plt.figure(figsize=(16,5))
# History of accuracy score
plt.subplot(1, 2, 1)
plt.plot(hist.history['accuracy'])
plt.plot(hist.history['val_accuracy'])
plt.title('Evolution of model accuracy score by epochs')
plt.ylabel('accuracy_score')
plt.xlabel('epoch')
plt.legend(['Train', 'Validation'], loc='lower right')
#---------------------------------
# History of Loss
plt.subplot(1, 2, 2)
plt.plot(hist.history['loss'])
plt.plot(hist.history['val_loss'])
plt.title('Evolution of Loss (Categorical Cross-Entropy) by epochs')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['Train', 'Validation'], loc='upper right')
plt.tight_layout()
To evaluate the performance of the model, we will use the evaluate() function, which returns the loss value and metrics values for the model on the test data.
# Obtain cross-entropy loss and accuracy scores on validation dataset
loss, accuracy = model.evaluate(X_val, y_val_encoded)
print('Validation cross-entropy Loss:', loss)
print('Validation classification Accuracy:', accuracy)
1875/1875 [==============================] - 2s 729us/step - loss: 2.3376 - accuracy: 0.1056 Validation cross-entropy Loss: 2.3376057147979736 Validation classification Accuracy: 0.10563333332538605
# Obtain cross-entropy loss and accuracy scores on test dataset
loss, accuracy = model.evaluate(X_test, y_test_encoded)
print('Test cross-entropy Loss:', loss)
print('Test classification Accuracy:', accuracy)
563/563 [==============================] - 1s 983us/step - loss: 2.3399 - accuracy: 0.1053 Test cross-entropy Loss: 2.3399112224578857 Test classification Accuracy: 0.10527777671813965
As noted above, with a batch size of 64 and the Adam optimizer at a learning rate of 0.003, we get validation and test accuracies of roughly 83% and 81% respectively.
• After around 15 epochs the training and validation accuracies more or less stabilise and then improve only gradually.
• There is not much of a gap between the training and validation curves, so there is no clear evidence of overfitting.
• However, the loss and accuracy on the validation dataset are slightly more volatile than on the training dataset.
• Next, I increase the number of hidden layers to try to improve the test accuracy: overfit first, then apply regularisation.
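The "overfit first, then regularise" plan usually means adding an L2 weight penalty (e.g. `kernel_regularizer=tf.keras.regularizers.l2(...)` on the Dense layers) or dropout once the deeper network overfits. L2 regularisation simply adds λ·Σw² to the loss; a toy numeric sketch with arbitrary weights and λ:

```python
def l2_penalty(weights, lam):
    """L2 regularisation term: lam * sum of squared weights."""
    return lam * sum(w * w for w in weights)

base_loss = 0.50            # toy cross-entropy value
weights = [0.5, -1.0, 2.0]  # toy layer weights
total = base_loss + l2_penalty(weights, lam=0.01)
print(total)                # ≈ 0.5525 (0.5 + 0.01 * 5.25)
```

The penalty grows with weight magnitude, pushing the optimizer toward smaller weights and smoother decision boundaries.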
# Initialize Neural Network (Sequential) model
model2 = tf.keras.Sequential()
# Reshape the input of 32 x 32 image into 1d array with 1024 features
model2.add(tf.keras.layers.Reshape(target_shape=(1024,), input_shape=(32,32,)))
# Add Layer 1 with 256 neurons and Leaky-ReLU activation function
model2.add(tf.keras.layers.Dense(units=256,
                                 kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
                                 activation=tf.keras.layers.LeakyReLU(alpha=0.3),
                                 name='HL1'))
# Add Layer 2 with 128 neurons and Leaky-ReLU activation function
model2.add(tf.keras.layers.Dense(units=128,
                                 kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
                                 activation=tf.keras.layers.LeakyReLU(alpha=0.3),
                                 name='HL2'))
# Add Layer 3 with 64 neurons and Leaky-ReLU activation function
model2.add(tf.keras.layers.Dense(units=64,
                                 kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
                                 activation=tf.keras.layers.LeakyReLU(alpha=0.3),
                                 name='HL3'))
# Add Layer 4 with 32 neurons and Leaky-ReLU activation function
model2.add(tf.keras.layers.Dense(units=32,
                                 kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
                                 activation=tf.keras.layers.LeakyReLU(alpha=0.3),
                                 name='HL4'))
# Add Layer 5 with 32 neurons and Leaky-ReLU activation function
model2.add(tf.keras.layers.Dense(units=32,
                                 kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
                                 activation=tf.keras.layers.LeakyReLU(alpha=0.3),
                                 name='HL5'))
# Output Layer with 10 neurons and softmax activation function
model2.add(tf.keras.layers.Dense(units=10, activation='softmax', name='Output'))
# compile model
model2.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.003),
               loss='categorical_crossentropy', metrics=['accuracy'])
# Set early stopping criteria (i.e., no improvement in validation loss in 10 successive epochs)
callback = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=10,
                                            restore_best_weights=True, mode='min')
# Train model
hist2 = model2.fit(X_train, y_train_encoded, batch_size=64, epochs=150, verbose=1,
                   validation_data=(X_val, y_val_encoded), callbacks=[callback])
Epoch 1/150 657/657 [==============================] - 2s 3ms/step - loss: 2.3136 - accuracy: 0.1011 - val_loss: 2.3124 - val_accuracy: 0.1051
... (epochs 2-48 omitted; losses fall and accuracies rise steadily) ...
Epoch 49/150 657/657 [==============================] - 2s 3ms/step - loss: 0.5188 - accuracy: 0.8383 - val_loss: 0.4960 - val_accuracy: 0.8502
... (epochs 50-58 omitted) ...
Epoch 59/150 657/657 [==============================] - 2s 3ms/step - loss: 0.4768 - accuracy: 0.8515 - val_loss: 0.5002 - val_accuracy: 0.8485
plt.figure(figsize=(16,5))
# History of accuracy score
plt.subplot(1, 2, 1)
plt.plot(hist2.history['accuracy'])
plt.plot(hist2.history['val_accuracy'])
plt.title('Evolution of model accuracy score by epochs')
plt.ylabel('accuracy_score')
plt.xlabel('epoch')
plt.legend(['Train', 'Validation'], loc='lower right')
#---------------------------------
# History of Loss
plt.subplot(1, 2, 2)
plt.plot(hist2.history['loss'])
plt.plot(hist2.history['val_loss'])
plt.title('Evolution of Loss (Categorical Cross-Entropy) by epochs')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['Train', 'Validation'], loc='upper right')
plt.tight_layout()
To evaluate the performance of the model, we use the evaluate() function, which returns the loss value and metric values for the model on the supplied dataset.
# Obtain cross-entropy loss and accuracy scores on validation dataset
loss, accuracy = model2.evaluate(X_val, y_val_encoded)
print('Validation cross-entropy Loss:', loss)
print('Validation classification Accuracy:', accuracy)
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4960 - accuracy: 0.8502
Validation cross-entropy Loss: 0.4959568679332733
Validation classification Accuracy: 0.8501999974250793
# Obtain cross-entropy loss and accuracy scores on test dataset
loss, accuracy = model2.evaluate(X_test, y_test_encoded)
print('Test cross-entropy Loss:', loss)
print('Test classification Accuracy:', accuracy)
563/563 [==============================] - 0s 749us/step - loss: 0.6140 - accuracy: 0.8272
Test cross-entropy Loss: 0.6139991283416748
Test classification Accuracy: 0.8271666765213013
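Keras' accuracy metric for one-hot targets is simply the fraction of samples whose predicted class (the argmax of the softmax output) matches the true class. A minimal NumPy sketch with toy arrays (illustrative only, not the model's actual outputs):

```python
import numpy as np

# Toy softmax outputs for 4 samples over 5 classes (illustrative, not model2's outputs)
probs = np.array([
    [0.05, 0.70, 0.10, 0.10, 0.05],   # argmax -> class 1
    [0.80, 0.05, 0.05, 0.05, 0.05],   # argmax -> class 0
    [0.10, 0.10, 0.60, 0.10, 0.10],   # argmax -> class 2
    [0.10, 0.10, 0.10, 0.40, 0.30],   # argmax -> class 3
])
# One-hot encoded true labels (as produced by to_categorical): classes 1, 0, 4, 3
y_true = np.eye(5)[[1, 0, 4, 3]]

# Categorical accuracy: fraction of samples where predicted class == true class
accuracy = float(np.mean(np.argmax(probs, axis=1) == np.argmax(y_true, axis=1)))
print(accuracy)  # 3 of 4 correct -> 0.75
```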
• With additional hidden layers (5 in all), both validation accuracy (~85%) and test accuracy (~83%) improve.
• Comparing training accuracy against test accuracy suggests some overfitting is creeping in.
• Let's increase the number of hidden layers to 7 and also increase the batch size to 128.
• Since the SVHN data is noisy, a smaller batch size is usually advisable; but given the size of the training data (42,000 samples), we can increase the batch size to 128 so that each update learns from more training examples per step.
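The effect of batch size on updates per epoch is simple arithmetic: steps per epoch = ceil(training samples / batch size). This matches the progress bars in the training logs (657 steps at batch size 64, 329 steps at batch size 128):

```python
import math

n_train = 42000  # number of training samples, as stated above

# Steps (weight updates) per epoch = ceil(samples / batch_size)
for batch_size in (64, 128):
    steps = math.ceil(n_train / batch_size)
    print(f"batch_size={batch_size}: {steps} steps per epoch")
# batch_size=64  -> 657 steps per epoch (the 657/657 progress bars above)
# batch_size=128 -> 329 steps per epoch (the 329/329 progress bars below)
```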
# Initialize Neural Network (Sequential) model
model3 = tf.keras.Sequential()
# Reshape the input of 32 x 32 image into 1d array with 1024 features
model3.add(tf.keras.layers.Reshape(target_shape=(1024,), input_shape=(32,32,)))
# Add Layer 1 with 256 neurons and Leaky-ReLU activation function
model3.add(tf.keras.layers.Dense(units=256,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL1'))
# Add Layer 2 with 128 neurons and Leaky-ReLU activation function
model3.add(tf.keras.layers.Dense(units=128,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL2'))
# Add Layer 3 with 64 neurons and Leaky-ReLU activation function
model3.add(tf.keras.layers.Dense(units=64,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL3'))
# Add Layer 4 with 64 neurons and Leaky-ReLU activation function
model3.add(tf.keras.layers.Dense(units=64,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL4'))
# Add Layer 5 with 32 neurons and Leaky-ReLU activation function
model3.add(tf.keras.layers.Dense(units=32,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL5'))
# Add Layer 6 with 32 neurons and Leaky-ReLU activation function
model3.add(tf.keras.layers.Dense(units=32,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL6'))
# Add Layer 7 with 32 neurons and Leaky-ReLU activation function
model3.add(tf.keras.layers.Dense(units=32,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL7'))
# Output Layer with 10 neurons and softmax activation function
model3.add(tf.keras.layers.Dense(units=10, activation='softmax', name='Output'))
# compile model
model3.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.003),
loss='categorical_crossentropy', metrics=['accuracy'])
# Set early stopping criteria (i.e., no improvement in validation loss in 10 successive epochs)
callback = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=10,
restore_best_weights=True, mode='min')
# Train model
hist3 = model3.fit(X_train, y_train_encoded, batch_size=128, epochs=150, verbose=1,
validation_data=(X_val, y_val_encoded), callbacks=[callback])
Epoch 1/150
329/329 [==============================] - 2s 4ms/step - loss: 2.3101 - accuracy: 0.1011 - val_loss: 2.3040 - val_accuracy: 0.0997
...
Epoch 62/150
329/329 [==============================] - 1s 4ms/step - loss: 0.5042 - accuracy: 0.8417 - val_loss: 0.4841 - val_accuracy: 0.8521
...
Epoch 71/150
329/329 [==============================] - 1s 4ms/step - loss: 0.4638 - accuracy: 0.8530 - val_loss: 0.5313 - val_accuracy: 0.8386
Epoch 72/150
329/329 [==============================] - 1s 4ms/step - loss: 0.4653 - accuracy: 0.8546 - val_loss: 0.5013 - val_accuracy: 0.8466
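The early-stopping callback (patience=10, restore_best_weights=True) explains why training halted at epoch 72: val_loss last improved at epoch 62 (0.4841), ten epochs passed with no further improvement, and the epoch-62 weights were restored. A minimal pure-Python sketch of that logic (not the actual Keras internals):

```python
# Sketch of EarlyStopping on val_loss with restore_best_weights=True.
def early_stop(val_losses, patience=10):
    best_loss, best_epoch, wait = float("inf"), -1, 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best_loss:           # improvement: remember it, reset the counter
            best_loss, best_epoch, wait = loss, epoch, 0
        else:
            wait += 1                  # no improvement this epoch
            if wait >= patience:       # stop; best epoch's weights are restored
                return best_epoch, best_loss, epoch
    return best_epoch, best_loss, len(val_losses)

# Toy val_loss trace: improves until epoch 3, then plateaus
trace = [1.0, 0.8, 0.5] + [0.6] * 12
print(early_stop(trace))  # (3, 0.5, 13): best at epoch 3, stopped at epoch 13
```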
We can use Matplotlib to plot training and validation loss versus the number of epochs, and training and validation accuracy versus the number of epochs.
plt.figure(figsize=(16,5))
# History of accuracy score
plt.subplot(1, 2, 1)
plt.plot(hist3.history['accuracy'])
plt.plot(hist3.history['val_accuracy'])
plt.title('Evolution of model accuracy score by epochs')
plt.ylabel('accuracy_score')
plt.xlabel('epoch')
plt.legend(['Train', 'Validation'], loc='lower right')
#---------------------------------
# History of Loss
plt.subplot(1, 2, 2)
plt.plot(hist3.history['loss'])
plt.plot(hist3.history['val_loss'])
plt.title('Evolution of Loss (Categorical Cross-Entropy) by epochs')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['Train', 'Validation'], loc='upper right')
plt.tight_layout()
To evaluate the performance of the model, we use the evaluate() function, which returns the loss value and metric values for the model on the supplied dataset.
# Obtain cross-entropy loss and accuracy scores on validation dataset
loss, accuracy = model3.evaluate(X_val, y_val_encoded)
print('Validation cross-entropy Loss:', loss)
print('Validation classification Accuracy:', accuracy)
1875/1875 [==============================] - 2s 1ms/step - loss: 0.4841 - accuracy: 0.8521
Validation cross-entropy Loss: 0.484103262424469
Validation classification Accuracy: 0.8520500063896179
# Obtain cross-entropy loss and accuracy scores on test dataset
loss, accuracy = model3.evaluate(X_test, y_test_encoded)
print('Test cross-entropy Loss:', loss)
print('Test classification Accuracy:', accuracy)
563/563 [==============================] - 0s 769us/step - loss: 0.5887 - accuracy: 0.8238
Test cross-entropy Loss: 0.5886817574501038
Test classification Accuracy: 0.8237777948379517
• With a batch size of 128 and 7 hidden layers, validation accuracy (~85%) is comparable to before, while test accuracy (~82%) has not improved.
• There is still a certain degree of overfitting (training accuracy exceeds test accuracy).
• We can now add dropout layers after the hidden layers to regularize the model and help avoid overfitting.
• We can also add batch normalization for further regularization.
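Dropout randomly zeroes a fraction `rate` of activations during training and rescales the survivors by 1/(1 - rate), so the expected activation is unchanged and no rescaling is needed at inference. A NumPy sketch of this "inverted dropout" scheme (illustrative, not the Keras internals):

```python
import numpy as np

rng = np.random.default_rng(7)

def dropout(x, rate, training):
    """Inverted dropout: zero a fraction `rate` of units and rescale the rest."""
    if not training or rate == 0.0:
        return x                       # inference: identity, no rescaling needed
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)     # rescale so E[output] == x

x = np.ones((1000, 256))
y = dropout(x, rate=0.20, training=True)

print(round(float((y == 0).mean()), 2))   # ~0.20 of units zeroed
print(round(float(y.mean()), 2))          # ~1.00: expectation preserved
```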
# Initialize Neural Network (Sequential) model
model4 = tf.keras.Sequential()
# Reshape the input of 32 x 32 image into 1d array with 1024 features
model4.add(tf.keras.layers.Reshape(target_shape=(1024,), input_shape=(32,32,)))
# Add BatchNormalization layer
model4.add(tf.keras.layers.BatchNormalization())
# Add Layer 1 with 256 neurons and Leaky-ReLU activation function
model4.add(tf.keras.layers.Dense(units=256,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL1'))
# Dropout for regularization to prevent overfitting
model4.add(tf.keras.layers.Dropout(rate=0.20))
# Add Layer 2 with 128 neurons and Leaky-ReLU activation function
model4.add(tf.keras.layers.Dense(units=128,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL2'))
# Dropout for regularization to prevent overfitting
model4.add(tf.keras.layers.Dropout(rate=0.15))
# Add Layer 3 with 64 neurons and Leaky-ReLU activation function
model4.add(tf.keras.layers.Dense(units=64,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL3'))
# Dropout for regularization to prevent overfitting
model4.add(tf.keras.layers.Dropout(rate=0.05))
# Add Layer 4 with 64 neurons and Leaky-ReLU activation function
model4.add(tf.keras.layers.Dense(units=64,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL4'))
# Dropout for regularization to prevent overfitting
model4.add(tf.keras.layers.Dropout(rate=0.05))
# Add Layer 5 with 32 neurons and Leaky-ReLU activation function
model4.add(tf.keras.layers.Dense(units=32,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL5'))
# Dropout for regularization to prevent overfitting
model4.add(tf.keras.layers.Dropout(rate=0.05))
# Add Layer 6 with 32 neurons and Leaky-ReLU activation function
model4.add(tf.keras.layers.Dense(units=32,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL6'))
# Dropout for regularization to prevent overfitting
model4.add(tf.keras.layers.Dropout(rate=0.05))
# Add Layer 7 with 32 neurons and Leaky-ReLU activation function
model4.add(tf.keras.layers.Dense(units=32,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL7'))
# Dropout for regularization to prevent overfitting
model4.add(tf.keras.layers.Dropout(rate=0.05))
# Output Layer with 10 neurons and softmax activation function
model4.add(tf.keras.layers.Dense(units=10, activation='softmax', name='Output'))
# compile model
model4.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.003),
loss='categorical_crossentropy', metrics=['accuracy'])
# Set early stopping criteria (i.e., no improvement in validation loss in 10 successive epochs)
callback = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=10,
restore_best_weights=True, mode='min')
# Train model
hist4 = model4.fit(X_train, y_train_encoded, batch_size=128, epochs=150, verbose=1,
validation_data=(X_val, y_val_encoded), callbacks=[callback])
Epoch 1/150
329/329 [==============================] - 3s 7ms/step - loss: 1.8339 - accuracy: 0.3361 - val_loss: 1.2870 - val_accuracy: 0.5780
...
Epoch 77/150
329/329 [==============================] - 2s 6ms/step - loss: 0.5689 - accuracy: 0.8284 - val_loss: 0.3880 - val_accuracy: 0.8848
...
Epoch 87/150
329/329 [==============================] - 2s 6ms/step - loss: 0.5580 - accuracy: 0.8315 - val_loss: 0.4234 - val_accuracy: 0.8744
We can plot the training and validation loss, and the training and validation accuracy, against the number of epochs using Matplotlib.
plt.figure(figsize=(16,5))
# History of accuracy score
plt.subplot(1, 2, 1)
plt.plot(hist4.history['accuracy'])
plt.plot(hist4.history['val_accuracy'])
plt.title('Evolution of model accuracy score by epochs')
plt.ylabel('accuracy_score')
plt.xlabel('epoch')
plt.legend(['Train', 'Validation'], loc='lower right')
#---------------------------------
# History of Loss
plt.subplot(1, 2, 2)
plt.plot(hist4.history['loss'])
plt.plot(hist4.history['val_loss'])
plt.title('Evolution of Loss (Categorical Cross-Entropy) by epochs')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['Train', 'Validation'], loc='upper right')
plt.tight_layout()
To evaluate the performance of the model, we will use the evaluate() function, which returns the loss value and metric values for the model on the given data.
# Obtain cross-entropy loss and accuracy scores on validation dataset
loss, accuracy = model4.evaluate(X_val, y_val_encoded)
print('Validation cross-entropy Loss:', loss)
print('Validation classification Accuracy:', accuracy)
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3880 - accuracy: 0.8848 Validation cross-entropy Loss: 0.3879580497741699 Validation classification Accuracy: 0.8847500085830688
# Obtain cross-entropy loss and accuracy scores on test dataset
loss, accuracy = model4.evaluate(X_test, y_test_encoded)
print('Test cross-entropy Loss:', loss)
print('Test classification Accuracy:', accuracy)
563/563 [==============================] - 0s 856us/step - loss: 0.4742 - accuracy: 0.8624 Test cross-entropy Loss: 0.4742424190044403 Test classification Accuracy: 0.8624444603919983
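The two numbers that evaluate() reports can be reproduced by hand: categorical cross-entropy is the mean negative log-probability assigned to the true class, and accuracy is the fraction of samples whose argmax prediction matches the label. A minimal NumPy sketch with made-up softmax probabilities (not the model's actual outputs):

```python
import numpy as np

# Hypothetical softmax outputs for 4 samples over 3 classes
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.3, 0.3, 0.4],
                  [0.6, 0.3, 0.1]])
# One-hot true labels, as produced by to_categorical
y_true = np.array([[1, 0, 0],
                   [0, 1, 0],
                   [1, 0, 0],
                   [1, 0, 0]])

# Categorical cross-entropy: mean of -log(probability of the true class)
loss = -np.mean(np.log(np.sum(probs * y_true, axis=1)))
# Accuracy: fraction of samples where the argmax matches the true class
accuracy = np.mean(probs.argmax(axis=1) == y_true.argmax(axis=1))
# Sample 3 is misclassified (argmax = class 2, truth = class 0), so accuracy = 0.75
```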
• Adding a dropout layer after every hidden layer does improve accuracy on both the validation and test datasets (~88% and ~86% respectively).
• Notably, validation accuracy is consistently higher (and validation loss lower) than training accuracy (loss). This may seem counterintuitive, especially when compared with the earlier runs without any dropout layers.
• However, because a dropout layer follows every hidden layer, a fraction of neurons is disabled after each layer during training. A fraction of the information about each sample/feature is lost, and subsequent layers must construct predictions from these "incomplete" representations. This makes it artificially harder for the network to give right answers, hence the lower training accuracy.
• During validation, by contrast, dropout is disabled and all units participate in prediction. The network has the "complete" information available to predict class membership, hence the higher validation accuracy.
• Next, I add batch normalization after every layer to check whether validation and test accuracy improve further.
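The train/inference asymmetry of dropout described above can be sketched in plain NumPy. This is the "inverted dropout" scheme that Keras uses, not the Keras implementation itself: during training, units are zeroed with probability rate and survivors are rescaled by 1/(1-rate); at inference, the layer is the identity.

```python
import numpy as np

rng = np.random.default_rng(7)

def dropout(x, rate, training):
    """Inverted dropout: zero a fraction of units during training and rescale
    survivors by 1/(1-rate); pass inputs through unchanged at inference."""
    if not training:
        return x
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

x = np.ones(8)
train_out = dropout(x, rate=0.5, training=True)   # mix of 0.0 and 2.0 entries
infer_out = dropout(x, rate=0.5, training=False)  # unchanged: all 1.0
```

The rescaling keeps the expected activation the same in both modes, so no weight adjustment is needed when dropout is switched off for validation and test.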
# Initialize Neural Network (Sequential) model
model5 = tf.keras.Sequential()
# Reshape the input of 32 x 32 image into 1d array with 1024 features
model5.add(tf.keras.layers.Reshape(target_shape=(1024,), input_shape=(32,32,)))
# Add BatchNormalization layer
model5.add(tf.keras.layers.BatchNormalization())
# Add Layer 1 with 256 neurons and Leaky-ReLU activation function
model5.add(tf.keras.layers.Dense(units=256,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL1'))
# Add BatchNormalization layer
model5.add(tf.keras.layers.BatchNormalization())
# Dropout for regularization to prevent overfitting
model5.add(tf.keras.layers.Dropout(rate=0.20))
# Add Layer 2 with 128 neurons and Leaky-ReLU activation function
model5.add(tf.keras.layers.Dense(units=128,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL2'))
# Add BatchNormalization layer
model5.add(tf.keras.layers.BatchNormalization())
# Dropout for regularization to prevent overfitting
model5.add(tf.keras.layers.Dropout(rate=0.15))
# Add Layer 3 with 64 neurons and Leaky-ReLU activation function
model5.add(tf.keras.layers.Dense(units=64,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL3'))
# Add BatchNormalization layer
model5.add(tf.keras.layers.BatchNormalization())
# Dropout for regularization to prevent overfitting
model5.add(tf.keras.layers.Dropout(rate=0.05))
# Add Layer 4 with 64 neurons and Leaky-ReLU activation function
model5.add(tf.keras.layers.Dense(units=64,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL4'))
# Add BatchNormalization layer
model5.add(tf.keras.layers.BatchNormalization())
# Dropout for regularization to prevent overfitting
model5.add(tf.keras.layers.Dropout(rate=0.05))
# Add Layer 5 with 32 neurons and Leaky-ReLU activation function
model5.add(tf.keras.layers.Dense(units=32,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL5'))
# Add BatchNormalization layer
model5.add(tf.keras.layers.BatchNormalization())
# Dropout for regularization to prevent overfitting
model5.add(tf.keras.layers.Dropout(rate=0.05))
# Add Layer 6 with 32 neurons and Leaky-ReLU activation function
model5.add(tf.keras.layers.Dense(units=32,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL6'))
# Add BatchNormalization layer
model5.add(tf.keras.layers.BatchNormalization())
# Dropout for regularization to prevent overfitting
model5.add(tf.keras.layers.Dropout(rate=0.05))
# Add Layer 7 with 32 neurons and Leaky-ReLU activation function
model5.add(tf.keras.layers.Dense(units=32,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL7'))
# Add BatchNormalization layer
model5.add(tf.keras.layers.BatchNormalization())
# Dropout for regularization to prevent overfitting
model5.add(tf.keras.layers.Dropout(rate=0.05))
# Output Layer with 10 neurons and softmax activation function
model5.add(tf.keras.layers.Dense(units=10, activation='softmax', name='Output'))
# compile model
model5.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.003),
loss='categorical_crossentropy', metrics=['accuracy'])
# Set early stopping criteria (i.e., no improvement in validation loss in 10 successive epochs)
callback = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=10,
restore_best_weights=True, mode='min')
# Train model
hist5 = model5.fit(X_train, y_train_encoded, batch_size=128, epochs=150, verbose=1,
validation_data=(X_val, y_val_encoded), callbacks=[callback])
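The patience-based rule behind the EarlyStopping callback above can be sketched as a plain Python loop: stop once the monitored validation loss has failed to improve for `patience` consecutive epochs, remembering the best epoch (whose weights restore_best_weights would bring back). This is a conceptual sketch, not the Keras internals.

```python
def early_stop_epoch(val_losses, patience=10):
    """Return (epoch training stops at, epoch with the best validation loss)."""
    best, best_epoch, wait = float('inf'), -1, 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            # Improvement: record it and reset the patience counter
            best, best_epoch, wait = loss, epoch, 0
        else:
            wait += 1
            if wait >= patience:
                return epoch, best_epoch
    # Patience never exhausted: training runs to the final epoch
    return len(val_losses) - 1, best_epoch

# Example: loss improves until epoch 2, then stalls for 10 epochs
losses = [1.0, 0.8, 0.6] + [0.7] * 10
early_stop_epoch(losses, patience=10)  # → (12, 2)
```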
Epoch 1/150 329/329 [==============================] - 4s 8ms/step - loss: 1.7608 - accuracy: 0.3806 - val_loss: 1.2507 - val_accuracy: 0.5961 Epoch 2/150 329/329 [==============================] - 3s 8ms/step - loss: 1.3056 - accuracy: 0.5760 - val_loss: 1.0130 - val_accuracy: 0.6798 Epoch 3/150 329/329 [==============================] - 3s 8ms/step - loss: 1.1482 - accuracy: 0.6378 - val_loss: 0.9924 - val_accuracy: 0.6896 Epoch 4/150 329/329 [==============================] - 3s 9ms/step - loss: 1.0614 - accuracy: 0.6699 - val_loss: 0.8427 - val_accuracy: 0.7340 Epoch 5/150 329/329 [==============================] - 3s 8ms/step - loss: 0.9952 - accuracy: 0.6913 - val_loss: 0.7786 - val_accuracy: 0.7594 Epoch 6/150 329/329 [==============================] - 3s 8ms/step - loss: 0.9462 - accuracy: 0.7059 - val_loss: 0.7583 - val_accuracy: 0.7644 Epoch 7/150 329/329 [==============================] - 2s 7ms/step - loss: 0.9012 - accuracy: 0.7210 - val_loss: 0.6971 - val_accuracy: 0.7844 Epoch 8/150 329/329 [==============================] - 3s 8ms/step - loss: 0.8656 - accuracy: 0.7331 - val_loss: 0.7203 - val_accuracy: 0.7760 Epoch 9/150 329/329 [==============================] - 3s 8ms/step - loss: 0.8441 - accuracy: 0.7387 - val_loss: 0.6442 - val_accuracy: 0.8029 Epoch 10/150 329/329 [==============================] - 3s 8ms/step - loss: 0.8228 - accuracy: 0.7455 - val_loss: 0.6544 - val_accuracy: 0.7990 Epoch 11/150 329/329 [==============================] - 3s 8ms/step - loss: 0.7999 - accuracy: 0.7544 - val_loss: 0.6217 - val_accuracy: 0.8074 Epoch 12/150 329/329 [==============================] - 3s 8ms/step - loss: 0.7859 - accuracy: 0.7559 - val_loss: 0.6128 - val_accuracy: 0.8099 Epoch 13/150 329/329 [==============================] - 3s 8ms/step - loss: 0.7668 - accuracy: 0.7628 - val_loss: 0.5997 - val_accuracy: 0.8159 Epoch 14/150 329/329 [==============================] - 2s 7ms/step - loss: 0.7568 - accuracy: 0.7671 - val_loss: 0.5813 - val_accuracy: 
0.8221 Epoch 15/150 329/329 [==============================] - 3s 8ms/step - loss: 0.7445 - accuracy: 0.7696 - val_loss: 0.5581 - val_accuracy: 0.8301 Epoch 16/150 329/329 [==============================] - 2s 7ms/step - loss: 0.7373 - accuracy: 0.7730 - val_loss: 0.6334 - val_accuracy: 0.8069 Epoch 17/150 329/329 [==============================] - 2s 7ms/step - loss: 0.7423 - accuracy: 0.7717 - val_loss: 0.5514 - val_accuracy: 0.8327 Epoch 18/150 329/329 [==============================] - 2s 7ms/step - loss: 0.7075 - accuracy: 0.7829 - val_loss: 0.5193 - val_accuracy: 0.8424 Epoch 19/150 329/329 [==============================] - 3s 8ms/step - loss: 0.7072 - accuracy: 0.7806 - val_loss: 0.5343 - val_accuracy: 0.8355 Epoch 20/150 329/329 [==============================] - 2s 7ms/step - loss: 0.6985 - accuracy: 0.7835 - val_loss: 0.5166 - val_accuracy: 0.8414 Epoch 21/150 329/329 [==============================] - 3s 8ms/step - loss: 0.6930 - accuracy: 0.7855 - val_loss: 0.4939 - val_accuracy: 0.8487 Epoch 22/150 329/329 [==============================] - 2s 7ms/step - loss: 0.6881 - accuracy: 0.7868 - val_loss: 0.4903 - val_accuracy: 0.8507 Epoch 23/150 329/329 [==============================] - 2s 7ms/step - loss: 0.6782 - accuracy: 0.7888 - val_loss: 0.4973 - val_accuracy: 0.8490 Epoch 24/150 329/329 [==============================] - 3s 8ms/step - loss: 0.6753 - accuracy: 0.7920 - val_loss: 0.5028 - val_accuracy: 0.8449 Epoch 25/150 329/329 [==============================] - 2s 7ms/step - loss: 0.6709 - accuracy: 0.7904 - val_loss: 0.5086 - val_accuracy: 0.8438 Epoch 26/150 329/329 [==============================] - 2s 8ms/step - loss: 0.6563 - accuracy: 0.7965 - val_loss: 0.5155 - val_accuracy: 0.8413 Epoch 27/150 329/329 [==============================] - 2s 7ms/step - loss: 0.6600 - accuracy: 0.7945 - val_loss: 0.4702 - val_accuracy: 0.8545 Epoch 28/150 329/329 [==============================] - 3s 8ms/step - loss: 0.6477 - accuracy: 0.7994 - val_loss: 0.4836 
- val_accuracy: 0.8505 Epoch 29/150 329/329 [==============================] - 3s 8ms/step - loss: 0.6556 - accuracy: 0.7977 - val_loss: 0.4595 - val_accuracy: 0.8610 Epoch 30/150 329/329 [==============================] - 2s 7ms/step - loss: 0.6466 - accuracy: 0.8002 - val_loss: 0.4638 - val_accuracy: 0.8582 Epoch 31/150 329/329 [==============================] - 2s 7ms/step - loss: 0.6360 - accuracy: 0.8034 - val_loss: 0.4659 - val_accuracy: 0.8565 Epoch 32/150 329/329 [==============================] - 3s 8ms/step - loss: 0.6324 - accuracy: 0.8030 - val_loss: 0.4592 - val_accuracy: 0.8598 Epoch 33/150 329/329 [==============================] - 3s 8ms/step - loss: 0.6251 - accuracy: 0.8078 - val_loss: 0.4688 - val_accuracy: 0.8553 Epoch 34/150 329/329 [==============================] - 2s 7ms/step - loss: 0.6190 - accuracy: 0.8082 - val_loss: 0.4452 - val_accuracy: 0.8636 Epoch 35/150 329/329 [==============================] - 2s 7ms/step - loss: 0.6246 - accuracy: 0.8053 - val_loss: 0.4554 - val_accuracy: 0.8601 Epoch 36/150 329/329 [==============================] - 2s 7ms/step - loss: 0.6191 - accuracy: 0.8098 - val_loss: 0.4446 - val_accuracy: 0.8639 Epoch 37/150 329/329 [==============================] - 3s 8ms/step - loss: 0.6169 - accuracy: 0.8098 - val_loss: 0.4511 - val_accuracy: 0.8602 Epoch 38/150 329/329 [==============================] - 2s 8ms/step - loss: 0.6128 - accuracy: 0.8093 - val_loss: 0.4311 - val_accuracy: 0.8694 Epoch 39/150 329/329 [==============================] - 3s 8ms/step - loss: 0.6078 - accuracy: 0.8106 - val_loss: 0.4223 - val_accuracy: 0.8714 Epoch 40/150 329/329 [==============================] - 3s 8ms/step - loss: 0.6048 - accuracy: 0.8106 - val_loss: 0.4393 - val_accuracy: 0.8663 Epoch 41/150 329/329 [==============================] - 3s 8ms/step - loss: 0.6115 - accuracy: 0.8094 - val_loss: 0.4418 - val_accuracy: 0.8647 Epoch 42/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5992 - accuracy: 0.8142 - 
val_loss: 0.4171 - val_accuracy: 0.8712 Epoch 43/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5913 - accuracy: 0.8160 - val_loss: 0.4134 - val_accuracy: 0.8753 Epoch 44/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5923 - accuracy: 0.8150 - val_loss: 0.4311 - val_accuracy: 0.8677 Epoch 45/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5935 - accuracy: 0.8164 - val_loss: 0.4223 - val_accuracy: 0.8709 Epoch 46/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5926 - accuracy: 0.8174 - val_loss: 0.4066 - val_accuracy: 0.8764 Epoch 47/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5854 - accuracy: 0.8181 - val_loss: 0.4178 - val_accuracy: 0.8705 Epoch 48/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5845 - accuracy: 0.8187 - val_loss: 0.4180 - val_accuracy: 0.8710 Epoch 49/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5804 - accuracy: 0.8198 - val_loss: 0.4080 - val_accuracy: 0.8753 Epoch 50/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5842 - accuracy: 0.8169 - val_loss: 0.4132 - val_accuracy: 0.8747 Epoch 51/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5780 - accuracy: 0.8206 - val_loss: 0.4233 - val_accuracy: 0.8698 Epoch 52/150 329/329 [==============================] - 2s 8ms/step - loss: 0.5763 - accuracy: 0.8217 - val_loss: 0.3963 - val_accuracy: 0.8782 Epoch 53/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5744 - accuracy: 0.8221 - val_loss: 0.3945 - val_accuracy: 0.8802 Epoch 54/150 329/329 [==============================] - 2s 8ms/step - loss: 0.5704 - accuracy: 0.8226 - val_loss: 0.3889 - val_accuracy: 0.8806 Epoch 55/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5656 - accuracy: 0.8268 - val_loss: 0.3904 - val_accuracy: 0.8802 Epoch 56/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5664 - 
accuracy: 0.8234 - val_loss: 0.3864 - val_accuracy: 0.8829 Epoch 57/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5660 - accuracy: 0.8234 - val_loss: 0.3928 - val_accuracy: 0.8804 Epoch 58/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5648 - accuracy: 0.8249 - val_loss: 0.3961 - val_accuracy: 0.8785 Epoch 59/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5596 - accuracy: 0.8253 - val_loss: 0.3951 - val_accuracy: 0.8796 Epoch 60/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5662 - accuracy: 0.8241 - val_loss: 0.3837 - val_accuracy: 0.8828 Epoch 61/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5519 - accuracy: 0.8282 - val_loss: 0.3778 - val_accuracy: 0.8849 Epoch 62/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5623 - accuracy: 0.8254 - val_loss: 0.3991 - val_accuracy: 0.8776 Epoch 63/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5498 - accuracy: 0.8302 - val_loss: 0.3978 - val_accuracy: 0.8784 Epoch 64/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5580 - accuracy: 0.8254 - val_loss: 0.3942 - val_accuracy: 0.8803 Epoch 65/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5573 - accuracy: 0.8270 - val_loss: 0.3805 - val_accuracy: 0.8845 Epoch 66/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5443 - accuracy: 0.8310 - val_loss: 0.3765 - val_accuracy: 0.8849 Epoch 67/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5530 - accuracy: 0.8287 - val_loss: 0.3757 - val_accuracy: 0.8859 Epoch 68/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5476 - accuracy: 0.8269 - val_loss: 0.3686 - val_accuracy: 0.8889 Epoch 69/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5421 - accuracy: 0.8330 - val_loss: 0.3667 - val_accuracy: 0.8878 Epoch 70/150 329/329 [==============================] - 2s 7ms/step 
- loss: 0.5476 - accuracy: 0.8305 - val_loss: 0.3695 - val_accuracy: 0.8876 Epoch 71/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5409 - accuracy: 0.8330 - val_loss: 0.3744 - val_accuracy: 0.8861 Epoch 72/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5421 - accuracy: 0.8324 - val_loss: 0.3809 - val_accuracy: 0.8834 Epoch 73/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5368 - accuracy: 0.8326 - val_loss: 0.3672 - val_accuracy: 0.8876 Epoch 74/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5412 - accuracy: 0.8313 - val_loss: 0.3693 - val_accuracy: 0.8889 Epoch 75/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5342 - accuracy: 0.8340 - val_loss: 0.3527 - val_accuracy: 0.8931 Epoch 76/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5338 - accuracy: 0.8324 - val_loss: 0.3619 - val_accuracy: 0.8909 Epoch 77/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5351 - accuracy: 0.8348 - val_loss: 0.3809 - val_accuracy: 0.8839 Epoch 78/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5303 - accuracy: 0.8342 - val_loss: 0.3571 - val_accuracy: 0.8909 Epoch 79/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5301 - accuracy: 0.8339 - val_loss: 0.3556 - val_accuracy: 0.8929 Epoch 80/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5341 - accuracy: 0.8355 - val_loss: 0.3671 - val_accuracy: 0.8869 Epoch 81/150 329/329 [==============================] - 2s 8ms/step - loss: 0.5303 - accuracy: 0.8345 - val_loss: 0.3842 - val_accuracy: 0.8823 Epoch 82/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5229 - accuracy: 0.8368 - val_loss: 0.3547 - val_accuracy: 0.8931 Epoch 83/150 329/329 [==============================] - 2s 8ms/step - loss: 0.5279 - accuracy: 0.8345 - val_loss: 0.3508 - val_accuracy: 0.8935 Epoch 84/150 329/329 
[==============================] - 3s 8ms/step - loss: 0.5269 - accuracy: 0.8359 - val_loss: 0.3515 - val_accuracy: 0.8952 Epoch 85/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5276 - accuracy: 0.8353 - val_loss: 0.3578 - val_accuracy: 0.8911 Epoch 86/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5233 - accuracy: 0.8364 - val_loss: 0.3521 - val_accuracy: 0.8925 Epoch 87/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5208 - accuracy: 0.8379 - val_loss: 0.3515 - val_accuracy: 0.8941 Epoch 88/150 329/329 [==============================] - 2s 8ms/step - loss: 0.5268 - accuracy: 0.8371 - val_loss: 0.3498 - val_accuracy: 0.8940 Epoch 89/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5191 - accuracy: 0.8376 - val_loss: 0.3712 - val_accuracy: 0.8878 Epoch 90/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5213 - accuracy: 0.8357 - val_loss: 0.3478 - val_accuracy: 0.8941 Epoch 91/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5212 - accuracy: 0.8376 - val_loss: 0.3441 - val_accuracy: 0.8952 Epoch 92/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5115 - accuracy: 0.8400 - val_loss: 0.3520 - val_accuracy: 0.8928 Epoch 93/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5215 - accuracy: 0.8379 - val_loss: 0.3437 - val_accuracy: 0.8963 Epoch 94/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5163 - accuracy: 0.8378 - val_loss: 0.3467 - val_accuracy: 0.8939 Epoch 95/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5118 - accuracy: 0.8407 - val_loss: 0.3442 - val_accuracy: 0.8946 Epoch 96/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5130 - accuracy: 0.8410 - val_loss: 0.3590 - val_accuracy: 0.8906 Epoch 97/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5063 - accuracy: 0.8431 - val_loss: 0.3483 - val_accuracy: 0.8940 
Epoch 98/150 329/329 [==============================] - 2s 8ms/step - loss: 0.5097 - accuracy: 0.8407 - val_loss: 0.3407 - val_accuracy: 0.8969 Epoch 99/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5143 - accuracy: 0.8391 - val_loss: 0.3471 - val_accuracy: 0.8939 Epoch 100/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5104 - accuracy: 0.8421 - val_loss: 0.3445 - val_accuracy: 0.8963 Epoch 101/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5115 - accuracy: 0.8403 - val_loss: 0.3345 - val_accuracy: 0.8991 Epoch 102/150 329/329 [==============================] - 2s 8ms/step - loss: 0.5031 - accuracy: 0.8431 - val_loss: 0.3503 - val_accuracy: 0.8946 Epoch 103/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5062 - accuracy: 0.8424 - val_loss: 0.3328 - val_accuracy: 0.8990 Epoch 104/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5021 - accuracy: 0.8429 - val_loss: 0.3419 - val_accuracy: 0.8967 Epoch 105/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5049 - accuracy: 0.8439 - val_loss: 0.3558 - val_accuracy: 0.8902 Epoch 106/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5029 - accuracy: 0.8429 - val_loss: 0.3414 - val_accuracy: 0.8971 Epoch 107/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5055 - accuracy: 0.8415 - val_loss: 0.3289 - val_accuracy: 0.9011 Epoch 108/150 329/329 [==============================] - 2s 7ms/step - loss: 0.5087 - accuracy: 0.8424 - val_loss: 0.3314 - val_accuracy: 0.9002 Epoch 109/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5025 - accuracy: 0.8430 - val_loss: 0.3282 - val_accuracy: 0.9024 Epoch 110/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5013 - accuracy: 0.8424 - val_loss: 0.3338 - val_accuracy: 0.8988 Epoch 111/150 329/329 [==============================] - 2s 7ms/step - loss: 0.4996 - accuracy: 0.8430 - val_loss: 
0.3371 - val_accuracy: 0.8982 Epoch 112/150 329/329 [==============================] - 2s 8ms/step - loss: 0.4979 - accuracy: 0.8434 - val_loss: 0.3318 - val_accuracy: 0.8996 Epoch 113/150 329/329 [==============================] - 2s 7ms/step - loss: 0.4950 - accuracy: 0.8453 - val_loss: 0.3515 - val_accuracy: 0.8925 Epoch 114/150 329/329 [==============================] - 3s 8ms/step - loss: 0.5064 - accuracy: 0.8437 - val_loss: 0.3327 - val_accuracy: 0.9001 Epoch 115/150 329/329 [==============================] - 2s 7ms/step - loss: 0.4938 - accuracy: 0.8471 - val_loss: 0.3214 - val_accuracy: 0.9037 Epoch 116/150 329/329 [==============================] - 3s 8ms/step - loss: 0.4961 - accuracy: 0.8462 - val_loss: 0.3246 - val_accuracy: 0.9020 Epoch 117/150 329/329 [==============================] - 3s 8ms/step - loss: 0.4979 - accuracy: 0.8440 - val_loss: 0.3249 - val_accuracy: 0.9022 Epoch 118/150 329/329 [==============================] - 3s 8ms/step - loss: 0.4923 - accuracy: 0.8468 - val_loss: 0.3330 - val_accuracy: 0.8990 Epoch 119/150 329/329 [==============================] - 3s 8ms/step - loss: 0.4992 - accuracy: 0.8443 - val_loss: 0.3380 - val_accuracy: 0.8986 Epoch 120/150 329/329 [==============================] - 2s 8ms/step - loss: 0.4935 - accuracy: 0.8480 - val_loss: 0.3217 - val_accuracy: 0.9025 Epoch 121/150 329/329 [==============================] - 2s 7ms/step - loss: 0.4921 - accuracy: 0.8462 - val_loss: 0.3286 - val_accuracy: 0.8996 Epoch 122/150 329/329 [==============================] - 2s 7ms/step - loss: 0.4941 - accuracy: 0.8459 - val_loss: 0.3410 - val_accuracy: 0.8969 Epoch 123/150 329/329 [==============================] - 3s 8ms/step - loss: 0.4880 - accuracy: 0.8475 - val_loss: 0.3379 - val_accuracy: 0.8971 Epoch 124/150 329/329 [==============================] - 2s 7ms/step - loss: 0.4861 - accuracy: 0.8493 - val_loss: 0.3189 - val_accuracy: 0.9039 Epoch 125/150 329/329 [==============================] - 2s 7ms/step - loss: 0.4897 
- accuracy: 0.8465 - val_loss: 0.3375 - val_accuracy: 0.8974 Epoch 126/150 329/329 [==============================] - 2s 7ms/step - loss: 0.4916 - accuracy: 0.8455 - val_loss: 0.3187 - val_accuracy: 0.9045 Epoch 127/150 329/329 [==============================] - 3s 8ms/step - loss: 0.4841 - accuracy: 0.8497 - val_loss: 0.3259 - val_accuracy: 0.9018 Epoch 128/150 329/329 [==============================] - 2s 7ms/step - loss: 0.4951 - accuracy: 0.8438 - val_loss: 0.3212 - val_accuracy: 0.9031 Epoch 129/150 329/329 [==============================] - 2s 8ms/step - loss: 0.4871 - accuracy: 0.8463 - val_loss: 0.3197 - val_accuracy: 0.9037 Epoch 130/150 329/329 [==============================] - 2s 7ms/step - loss: 0.4857 - accuracy: 0.8502 - val_loss: 0.3318 - val_accuracy: 0.9001 Epoch 131/150 329/329 [==============================] - 2s 7ms/step - loss: 0.4927 - accuracy: 0.8440 - val_loss: 0.3202 - val_accuracy: 0.9030 Epoch 132/150 329/329 [==============================] - 3s 8ms/step - loss: 0.4785 - accuracy: 0.8510 - val_loss: 0.3159 - val_accuracy: 0.9046 Epoch 133/150 329/329 [==============================] - 2s 7ms/step - loss: 0.4795 - accuracy: 0.8498 - val_loss: 0.3213 - val_accuracy: 0.9024 Epoch 134/150 329/329 [==============================] - 2s 7ms/step - loss: 0.4818 - accuracy: 0.8489 - val_loss: 0.3208 - val_accuracy: 0.9036 Epoch 135/150 329/329 [==============================] - 2s 7ms/step - loss: 0.4890 - accuracy: 0.8482 - val_loss: 0.3181 - val_accuracy: 0.9045 Epoch 136/150 329/329 [==============================] - 3s 8ms/step - loss: 0.4798 - accuracy: 0.8495 - val_loss: 0.3168 - val_accuracy: 0.9050 Epoch 137/150 329/329 [==============================] - 2s 7ms/step - loss: 0.4849 - accuracy: 0.8480 - val_loss: 0.3220 - val_accuracy: 0.9028 Epoch 138/150 329/329 [==============================] - 2s 7ms/step - loss: 0.4810 - accuracy: 0.8510 - val_loss: 0.3227 - val_accuracy: 0.9018 Epoch 139/150 329/329 
[==============================] - 2s 7ms/step - loss: 0.4862 - accuracy: 0.8483 - val_loss: 0.3127 - val_accuracy: 0.9052 Epoch 140/150 329/329 [==============================] - 3s 8ms/step - loss: 0.4737 - accuracy: 0.8525 - val_loss: 0.3194 - val_accuracy: 0.9043 Epoch 141/150 329/329 [==============================] - 3s 8ms/step - loss: 0.4923 - accuracy: 0.8467 - val_loss: 0.3150 - val_accuracy: 0.9041 Epoch 142/150 329/329 [==============================] - 3s 8ms/step - loss: 0.4767 - accuracy: 0.8513 - val_loss: 0.3146 - val_accuracy: 0.9057 Epoch 143/150 329/329 [==============================] - 3s 8ms/step - loss: 0.4762 - accuracy: 0.8505 - val_loss: 0.3273 - val_accuracy: 0.9024 Epoch 144/150 329/329 [==============================] - 3s 8ms/step - loss: 0.4765 - accuracy: 0.8526 - val_loss: 0.3091 - val_accuracy: 0.9076 Epoch 145/150 329/329 [==============================] - 3s 8ms/step - loss: 0.4823 - accuracy: 0.8487 - val_loss: 0.3192 - val_accuracy: 0.9037 Epoch 146/150 329/329 [==============================] - 3s 8ms/step - loss: 0.4764 - accuracy: 0.8511 - val_loss: 0.3159 - val_accuracy: 0.9044 Epoch 147/150 329/329 [==============================] - 2s 7ms/step - loss: 0.4708 - accuracy: 0.8527 - val_loss: 0.3005 - val_accuracy: 0.9102 Epoch 148/150 329/329 [==============================] - 2s 7ms/step - loss: 0.4695 - accuracy: 0.8529 - val_loss: 0.3102 - val_accuracy: 0.9081 Epoch 149/150 329/329 [==============================] - 3s 8ms/step - loss: 0.4824 - accuracy: 0.8488 - val_loss: 0.3026 - val_accuracy: 0.9099 Epoch 150/150 329/329 [==============================] - 2s 7ms/step - loss: 0.4712 - accuracy: 0.8513 - val_loss: 0.3181 - val_accuracy: 0.9036
We can plot the training and validation loss, and the training and validation accuracy, against the number of epochs using Matplotlib.
plt.figure(figsize=(16,5))
# History of accuracy score
plt.subplot(1, 2, 1)
plt.plot(hist5.history['accuracy'])
plt.plot(hist5.history['val_accuracy'])
plt.title('Evolution of model accuracy score by epochs')
plt.ylabel('accuracy_score')
plt.xlabel('epoch')
plt.legend(['Train', 'Validation'], loc='lower right')
#---------------------------------
# History of Loss
plt.subplot(1, 2, 2)
plt.plot(hist5.history['loss'])
plt.plot(hist5.history['val_loss'])
plt.title('Evolution of Loss (Categorical Cross-Entropy) by epochs')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['Train', 'Validation'], loc='upper right')
plt.tight_layout()
To evaluate the performance of the model, we will use the evaluate() function, which returns the loss value and metric values for the model on the given data.
# Obtain cross-entropy loss and accuracy scores on validation dataset
loss, accuracy = model5.evaluate(X_val, y_val_encoded)
print('Validation cross-entropy Loss:', loss)
print('Validation classification Accuracy:', accuracy)
1875/1875 [==============================] - 3s 1ms/step - loss: 0.3181 - accuracy: 0.9036 Validation cross-entropy Loss: 0.318147212266922 Validation classification Accuracy: 0.9035500288009644
# Obtain cross-entropy loss and accuracy scores on test dataset
loss, accuracy = model5.evaluate(X_test, y_test_encoded)
print('Test cross-entropy Loss:', loss)
print('Test classification Accuracy:', accuracy)
563/563 [==============================] - 1s 976us/step - loss: 0.4242 - accuracy: 0.8714 Test cross-entropy Loss: 0.42421963810920715 Test classification Accuracy: 0.8713889122009277
• The results above show the best validation and test accuracy scores so far
• Validation accuracy and test accuracy are 90%+ and 87%+ respectively
• As noted in the earlier iteration, validation accuracy (loss) is consistently higher (lower) than training accuracy (loss), attributable mainly to the dropout layer invoked after every hidden layer
• Now that we have regularization in place, let's consider a simpler model from a parsimony point of view
• So I keep the first three hidden layers, fine-tune the learning rate (to 0.0025) for the Adam optimizer, and increase epochs to 200, while keeping an early-stopping rule of no improvement in validation loss over 10 successive epochs
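Keras implements this stopping rule via the `EarlyStopping` callback used below. For intuition, a plain-Python sketch of the logic (patience of 10, restoring the best weights), assuming a per-epoch `val_losses` sequence:

```python
def early_stop_index(val_losses, patience=10):
    """Return the index of the epoch whose weights would be restored:
    the lowest validation loss seen before `patience` successive
    epochs pass with no improvement."""
    best_i, best_loss, wait = 0, float('inf'), 0
    for i, loss in enumerate(val_losses):
        if loss < best_loss:       # improvement: record it and reset the counter
            best_i, best_loss, wait = i, loss, 0
        else:
            wait += 1
            if wait >= patience:   # no improvement in `patience` epochs: stop
                break
    return best_i

# Toy loss curve: improves until epoch 3 (0-based), then plateaus
losses = [1.0, 0.8, 0.7, 0.65] + [0.66] * 12
print(early_stop_index(losses))  # → 3
```

With `restore_best_weights=True`, Keras hands back the weights from that best epoch rather than the last epoch trained.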
# Initialize Neural Network (Sequential) model
model6 = tf.keras.Sequential()
# Reshape the 32 x 32 image input into a 1-d array of 1024 features
model6.add(tf.keras.layers.Reshape(target_shape=(1024,), input_shape=(32,32,)))
# Add BatchNormalization layer
model6.add(tf.keras.layers.BatchNormalization())
# Add Layer 1 with 256 neurons and Leaky-ReLU activation function
model6.add(tf.keras.layers.Dense(units=256,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL1'))
# Add BatchNormalization layer
model6.add(tf.keras.layers.BatchNormalization())
# Dropout for regularization to prevent overfitting
model6.add(tf.keras.layers.Dropout(rate=0.20))
# Add Layer 2 with 128 neurons and Leaky-ReLU activation function
model6.add(tf.keras.layers.Dense(units=128,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL2'))
# Add BatchNormalization layer
model6.add(tf.keras.layers.BatchNormalization())
# Dropout for regularization to prevent overfitting
model6.add(tf.keras.layers.Dropout(rate=0.10))
# Add Layer 3 with 64 neurons and Leaky-ReLU activation function
model6.add(tf.keras.layers.Dense(units=64,
kernel_initializer=tf.keras.initializers.glorot_uniform(seed=7),
activation=tf.keras.layers.LeakyReLU(alpha=0.3),
name='HL3'))
# Add BatchNormalization layer
model6.add(tf.keras.layers.BatchNormalization())
# Dropout for regularization to prevent overfitting
model6.add(tf.keras.layers.Dropout(rate=0.05))
# Output Layer with 10 neurons and softmax activation function
model6.add(tf.keras.layers.Dense(units=10, activation='softmax', name='Output'))
# compile model
model6.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.0025),
loss='categorical_crossentropy', metrics=['accuracy'])
# Set early stopping criteria (i.e., no improvement in validation loss in 10 successive epochs)
callback = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=10,
restore_best_weights=True, mode='min')
# Train model
hist6 = model6.fit(X_train, y_train_encoded, batch_size=128, epochs=200, verbose=1,
validation_data=(X_val, y_val_encoded), callbacks=[callback])
Epoch 1/200 329/329 [==============================] - 3s 6ms/step - loss: 1.5678 - accuracy: 0.4781 - val_loss: 1.0936 - val_accuracy: 0.6743 Epoch 2/200 329/329 [==============================] - 2s 7ms/step - loss: 1.0911 - accuracy: 0.6576 - val_loss: 0.8408 - val_accuracy: 0.7491 Epoch 3/200 329/329 [==============================] - 2s 7ms/step - loss: 0.9731 - accuracy: 0.6977 - val_loss: 0.7706 - val_accuracy: 0.7652 Epoch 4/200 329/329 [==============================] - 2s 7ms/step - loss: 0.8854 - accuracy: 0.7272 - val_loss: 0.6960 - val_accuracy: 0.7891 Epoch 5/200 329/329 [==============================] - 2s 7ms/step - loss: 0.8370 - accuracy: 0.7407 - val_loss: 0.6660 - val_accuracy: 0.8011 Epoch 6/200 329/329 [==============================] - 2s 7ms/step - loss: 0.7937 - accuracy: 0.7548 - val_loss: 0.6226 - val_accuracy: 0.8140 Epoch 7/200 329/329 [==============================] - 2s 6ms/step - loss: 0.7736 - accuracy: 0.7588 - val_loss: 0.6290 - val_accuracy: 0.8075 Epoch 8/200 329/329 [==============================] - 2s 7ms/step - loss: 0.7424 - accuracy: 0.7733 - val_loss: 0.5725 - val_accuracy: 0.8303 Epoch 9/200 329/329 [==============================] - 2s 6ms/step - loss: 0.7169 - accuracy: 0.7793 - val_loss: 0.5654 - val_accuracy: 0.8302 Epoch 10/200 329/329 [==============================] - 2s 6ms/step - loss: 0.7147 - accuracy: 0.7794 - val_loss: 0.5390 - val_accuracy: 0.8400 Epoch 11/200 329/329 [==============================] - 2s 6ms/step - loss: 0.6953 - accuracy: 0.7869 - val_loss: 0.5385 - val_accuracy: 0.8382 Epoch 12/200 329/329 [==============================] - 2s 6ms/step - loss: 0.6851 - accuracy: 0.7880 - val_loss: 0.5383 - val_accuracy: 0.8365 Epoch 13/200 329/329 [==============================] - 2s 6ms/step - loss: 0.6649 - accuracy: 0.7937 - val_loss: 0.5085 - val_accuracy: 0.8476 Epoch 14/200 329/329 [==============================] - 2s 6ms/step - loss: 0.6583 - accuracy: 0.7957 - val_loss: 0.5092 - val_accuracy: 
0.8473 Epoch 15/200 329/329 [==============================] - 2s 6ms/step - loss: 0.6547 - accuracy: 0.7963 - val_loss: 0.4924 - val_accuracy: 0.8540 Epoch 16/200 329/329 [==============================] - 2s 6ms/step - loss: 0.6454 - accuracy: 0.8005 - val_loss: 0.4863 - val_accuracy: 0.8544 Epoch 17/200 329/329 [==============================] - 2s 6ms/step - loss: 0.6357 - accuracy: 0.8024 - val_loss: 0.4657 - val_accuracy: 0.8641 Epoch 18/200 329/329 [==============================] - 2s 7ms/step - loss: 0.6280 - accuracy: 0.8055 - val_loss: 0.4749 - val_accuracy: 0.8576 Epoch 19/200 329/329 [==============================] - 2s 6ms/step - loss: 0.6179 - accuracy: 0.8084 - val_loss: 0.4559 - val_accuracy: 0.8643 Epoch 20/200 329/329 [==============================] - 2s 6ms/step - loss: 0.6102 - accuracy: 0.8109 - val_loss: 0.4738 - val_accuracy: 0.8559 Epoch 21/200 329/329 [==============================] - 2s 6ms/step - loss: 0.6107 - accuracy: 0.8112 - val_loss: 0.4622 - val_accuracy: 0.8601 Epoch 22/200 329/329 [==============================] - 2s 6ms/step - loss: 0.6031 - accuracy: 0.8138 - val_loss: 0.4518 - val_accuracy: 0.8667 Epoch 23/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5922 - accuracy: 0.8153 - val_loss: 0.4383 - val_accuracy: 0.8702 Epoch 24/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5901 - accuracy: 0.8180 - val_loss: 0.4618 - val_accuracy: 0.8598 Epoch 25/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5887 - accuracy: 0.8176 - val_loss: 0.4299 - val_accuracy: 0.8740 Epoch 26/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5789 - accuracy: 0.8202 - val_loss: 0.4321 - val_accuracy: 0.8723 Epoch 27/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5801 - accuracy: 0.8205 - val_loss: 0.4236 - val_accuracy: 0.8745 Epoch 28/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5743 - accuracy: 0.8202 - val_loss: 0.4241 
- val_accuracy: 0.8739 Epoch 29/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5698 - accuracy: 0.8227 - val_loss: 0.4403 - val_accuracy: 0.8684 Epoch 30/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5662 - accuracy: 0.8250 - val_loss: 0.4375 - val_accuracy: 0.8684 Epoch 31/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5676 - accuracy: 0.8233 - val_loss: 0.4195 - val_accuracy: 0.8737 Epoch 32/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5600 - accuracy: 0.8247 - val_loss: 0.4224 - val_accuracy: 0.8731 Epoch 33/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5546 - accuracy: 0.8272 - val_loss: 0.4020 - val_accuracy: 0.8823 Epoch 34/200 329/329 [==============================] - 2s 7ms/step - loss: 0.5470 - accuracy: 0.8291 - val_loss: 0.3929 - val_accuracy: 0.8848 Epoch 35/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5462 - accuracy: 0.8312 - val_loss: 0.4347 - val_accuracy: 0.8685 Epoch 36/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5435 - accuracy: 0.8299 - val_loss: 0.4129 - val_accuracy: 0.8779 Epoch 37/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5406 - accuracy: 0.8304 - val_loss: 0.3978 - val_accuracy: 0.8818 Epoch 38/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5479 - accuracy: 0.8291 - val_loss: 0.4108 - val_accuracy: 0.8780 Epoch 39/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5415 - accuracy: 0.8314 - val_loss: 0.3953 - val_accuracy: 0.8835 Epoch 40/200 329/329 [==============================] - 2s 7ms/step - loss: 0.5292 - accuracy: 0.8348 - val_loss: 0.3863 - val_accuracy: 0.8852 Epoch 41/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5339 - accuracy: 0.8342 - val_loss: 0.3968 - val_accuracy: 0.8832 Epoch 42/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5343 - accuracy: 0.8325 - 
val_loss: 0.3992 - val_accuracy: 0.8818 Epoch 43/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5335 - accuracy: 0.8327 - val_loss: 0.3883 - val_accuracy: 0.8856 Epoch 44/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5283 - accuracy: 0.8355 - val_loss: 0.3754 - val_accuracy: 0.8897 Epoch 45/200 329/329 [==============================] - 2s 7ms/step - loss: 0.5340 - accuracy: 0.8313 - val_loss: 0.3999 - val_accuracy: 0.8783 Epoch 46/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5179 - accuracy: 0.8371 - val_loss: 0.3863 - val_accuracy: 0.8862 Epoch 47/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5215 - accuracy: 0.8371 - val_loss: 0.3867 - val_accuracy: 0.8848 Epoch 48/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5125 - accuracy: 0.8398 - val_loss: 0.3819 - val_accuracy: 0.8858 Epoch 49/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5165 - accuracy: 0.8382 - val_loss: 0.3826 - val_accuracy: 0.8861 Epoch 50/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5228 - accuracy: 0.8389 - val_loss: 0.3751 - val_accuracy: 0.8901 Epoch 51/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5085 - accuracy: 0.8407 - val_loss: 0.3686 - val_accuracy: 0.8922 Epoch 52/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5111 - accuracy: 0.8388 - val_loss: 0.3630 - val_accuracy: 0.8939 Epoch 53/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5065 - accuracy: 0.8406 - val_loss: 0.3680 - val_accuracy: 0.8928 Epoch 54/200 329/329 [==============================] - 2s 7ms/step - loss: 0.5031 - accuracy: 0.8425 - val_loss: 0.3743 - val_accuracy: 0.8907 Epoch 55/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4984 - accuracy: 0.8460 - val_loss: 0.3593 - val_accuracy: 0.8949 Epoch 56/200 329/329 [==============================] - 2s 7ms/step - loss: 0.5104 - 
accuracy: 0.8408 - val_loss: 0.3615 - val_accuracy: 0.8942 Epoch 57/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4990 - accuracy: 0.8444 - val_loss: 0.3718 - val_accuracy: 0.8898 Epoch 58/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5010 - accuracy: 0.8424 - val_loss: 0.3567 - val_accuracy: 0.8953 Epoch 59/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4978 - accuracy: 0.8434 - val_loss: 0.3625 - val_accuracy: 0.8924 Epoch 60/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5006 - accuracy: 0.8423 - val_loss: 0.3562 - val_accuracy: 0.8955 Epoch 61/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4865 - accuracy: 0.8467 - val_loss: 0.3492 - val_accuracy: 0.8972 Epoch 62/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5023 - accuracy: 0.8433 - val_loss: 0.3675 - val_accuracy: 0.8914 Epoch 63/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4966 - accuracy: 0.8432 - val_loss: 0.3583 - val_accuracy: 0.8959 Epoch 64/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4874 - accuracy: 0.8483 - val_loss: 0.3505 - val_accuracy: 0.8972 Epoch 65/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4911 - accuracy: 0.8462 - val_loss: 0.3427 - val_accuracy: 0.9014 Epoch 66/200 329/329 [==============================] - 2s 7ms/step - loss: 0.4879 - accuracy: 0.8477 - val_loss: 0.3533 - val_accuracy: 0.8964 Epoch 67/200 329/329 [==============================] - 2s 7ms/step - loss: 0.4873 - accuracy: 0.8455 - val_loss: 0.3484 - val_accuracy: 0.8978 Epoch 68/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4811 - accuracy: 0.8477 - val_loss: 0.3445 - val_accuracy: 0.9000 Epoch 69/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4854 - accuracy: 0.8475 - val_loss: 0.3472 - val_accuracy: 0.8978 Epoch 70/200 329/329 [==============================] - 2s 6ms/step 
- loss: 0.4780 - accuracy: 0.8493 - val_loss: 0.3403 - val_accuracy: 0.9010 Epoch 71/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4876 - accuracy: 0.8458 - val_loss: 0.3569 - val_accuracy: 0.8953 Epoch 72/200 329/329 [==============================] - 2s 7ms/step - loss: 0.4821 - accuracy: 0.8466 - val_loss: 0.3478 - val_accuracy: 0.8986 Epoch 73/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4840 - accuracy: 0.8496 - val_loss: 0.3456 - val_accuracy: 0.8996 Epoch 74/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4757 - accuracy: 0.8482 - val_loss: 0.3494 - val_accuracy: 0.8969 Epoch 75/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4827 - accuracy: 0.8493 - val_loss: 0.3603 - val_accuracy: 0.8934 Epoch 76/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4838 - accuracy: 0.8482 - val_loss: 0.3495 - val_accuracy: 0.8976 Epoch 77/200 329/329 [==============================] - 2s 7ms/step - loss: 0.4721 - accuracy: 0.8520 - val_loss: 0.3359 - val_accuracy: 0.9016 Epoch 78/200 329/329 [==============================] - 2s 7ms/step - loss: 0.4748 - accuracy: 0.8510 - val_loss: 0.3357 - val_accuracy: 0.9022 Epoch 79/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4751 - accuracy: 0.8511 - val_loss: 0.3336 - val_accuracy: 0.9024 Epoch 80/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4767 - accuracy: 0.8514 - val_loss: 0.3427 - val_accuracy: 0.8997 Epoch 81/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4649 - accuracy: 0.8539 - val_loss: 0.3234 - val_accuracy: 0.9060 Epoch 82/200 329/329 [==============================] - 2s 7ms/step - loss: 0.4690 - accuracy: 0.8523 - val_loss: 0.3315 - val_accuracy: 0.9038 Epoch 83/200 329/329 [==============================] - 2s 7ms/step - loss: 0.4711 - accuracy: 0.8524 - val_loss: 0.3346 - val_accuracy: 0.9016 Epoch 84/200 329/329 
[==============================] - 2s 7ms/step - loss: 0.4705 - accuracy: 0.8527 - val_loss: 0.3254 - val_accuracy: 0.9057 Epoch 85/200 329/329 [==============================] - 2s 7ms/step - loss: 0.4698 - accuracy: 0.8513 - val_loss: 0.3339 - val_accuracy: 0.9027 Epoch 86/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4687 - accuracy: 0.8510 - val_loss: 0.3370 - val_accuracy: 0.9018 Epoch 87/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4702 - accuracy: 0.8499 - val_loss: 0.3334 - val_accuracy: 0.9021 Epoch 88/200 329/329 [==============================] - 2s 7ms/step - loss: 0.4635 - accuracy: 0.8532 - val_loss: 0.3405 - val_accuracy: 0.9002 Epoch 89/200 329/329 [==============================] - 2s 7ms/step - loss: 0.4582 - accuracy: 0.8546 - val_loss: 0.3300 - val_accuracy: 0.9035 Epoch 90/200 329/329 [==============================] - 2s 7ms/step - loss: 0.4634 - accuracy: 0.8531 - val_loss: 0.3365 - val_accuracy: 0.9008 Epoch 91/200 329/329 [==============================] - 2s 7ms/step - loss: 0.4693 - accuracy: 0.8529 - val_loss: 0.3202 - val_accuracy: 0.9063 Epoch 92/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4563 - accuracy: 0.8546 - val_loss: 0.3224 - val_accuracy: 0.9067 Epoch 93/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4579 - accuracy: 0.8547 - val_loss: 0.3201 - val_accuracy: 0.9077 Epoch 94/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4564 - accuracy: 0.8553 - val_loss: 0.3186 - val_accuracy: 0.9067 Epoch 95/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4644 - accuracy: 0.8531 - val_loss: 0.3188 - val_accuracy: 0.9075 Epoch 96/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4509 - accuracy: 0.8581 - val_loss: 0.3287 - val_accuracy: 0.9033 Epoch 97/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4616 - accuracy: 0.8550 - val_loss: 0.3160 - val_accuracy: 0.9089 
Epoch 98/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4545 - accuracy: 0.8560 - val_loss: 0.3199 - val_accuracy: 0.9065 Epoch 99/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4547 - accuracy: 0.8560 - val_loss: 0.3265 - val_accuracy: 0.9034 Epoch 100/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4570 - accuracy: 0.8549 - val_loss: 0.3233 - val_accuracy: 0.9046 Epoch 101/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4542 - accuracy: 0.8568 - val_loss: 0.3218 - val_accuracy: 0.9064 Epoch 102/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4577 - accuracy: 0.8545 - val_loss: 0.3191 - val_accuracy: 0.9065 Epoch 103/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4518 - accuracy: 0.8581 - val_loss: 0.3133 - val_accuracy: 0.9094 Epoch 104/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4496 - accuracy: 0.8580 - val_loss: 0.3125 - val_accuracy: 0.9101 Epoch 105/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4504 - accuracy: 0.8597 - val_loss: 0.3172 - val_accuracy: 0.9082 Epoch 106/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4539 - accuracy: 0.8567 - val_loss: 0.3295 - val_accuracy: 0.9030 Epoch 107/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4507 - accuracy: 0.8584 - val_loss: 0.3209 - val_accuracy: 0.9058 Epoch 108/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4497 - accuracy: 0.8574 - val_loss: 0.3219 - val_accuracy: 0.9051 Epoch 109/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4436 - accuracy: 0.8607 - val_loss: 0.3199 - val_accuracy: 0.9068 Epoch 110/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4488 - accuracy: 0.8585 - val_loss: 0.3273 - val_accuracy: 0.9056 Epoch 111/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4474 - accuracy: 0.8588 - val_loss: 
0.3066 - val_accuracy: 0.9116 Epoch 112/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4497 - accuracy: 0.8574 - val_loss: 0.3147 - val_accuracy: 0.9089 Epoch 113/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4450 - accuracy: 0.8587 - val_loss: 0.3221 - val_accuracy: 0.9059 Epoch 114/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4475 - accuracy: 0.8603 - val_loss: 0.3101 - val_accuracy: 0.9106 Epoch 115/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4468 - accuracy: 0.8591 - val_loss: 0.3027 - val_accuracy: 0.9132 Epoch 116/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4409 - accuracy: 0.8591 - val_loss: 0.3081 - val_accuracy: 0.9107 Epoch 117/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4398 - accuracy: 0.8608 - val_loss: 0.3060 - val_accuracy: 0.9120 Epoch 118/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4435 - accuracy: 0.8577 - val_loss: 0.3013 - val_accuracy: 0.9137 Epoch 119/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4416 - accuracy: 0.8601 - val_loss: 0.3038 - val_accuracy: 0.9140 Epoch 120/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4372 - accuracy: 0.8626 - val_loss: 0.3058 - val_accuracy: 0.9118 Epoch 121/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4414 - accuracy: 0.8583 - val_loss: 0.2991 - val_accuracy: 0.9138 Epoch 122/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4400 - accuracy: 0.8609 - val_loss: 0.3024 - val_accuracy: 0.9120 Epoch 123/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4417 - accuracy: 0.8609 - val_loss: 0.3162 - val_accuracy: 0.9083 Epoch 124/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4355 - accuracy: 0.8613 - val_loss: 0.3019 - val_accuracy: 0.9132 Epoch 125/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4336 
- accuracy: 0.8631 - val_loss: 0.3005 - val_accuracy: 0.9134 Epoch 126/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4373 - accuracy: 0.8610 - val_loss: 0.2996 - val_accuracy: 0.9146 Epoch 127/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4368 - accuracy: 0.8611 - val_loss: 0.2997 - val_accuracy: 0.9147 Epoch 128/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4416 - accuracy: 0.8606 - val_loss: 0.3045 - val_accuracy: 0.9121 Epoch 129/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4370 - accuracy: 0.8617 - val_loss: 0.3060 - val_accuracy: 0.9108 Epoch 130/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4285 - accuracy: 0.8637 - val_loss: 0.2973 - val_accuracy: 0.9145 Epoch 131/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4332 - accuracy: 0.8612 - val_loss: 0.3016 - val_accuracy: 0.9140 Epoch 132/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4304 - accuracy: 0.8638 - val_loss: 0.3030 - val_accuracy: 0.9126 Epoch 133/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4319 - accuracy: 0.8628 - val_loss: 0.3005 - val_accuracy: 0.9136 Epoch 134/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4300 - accuracy: 0.8652 - val_loss: 0.2929 - val_accuracy: 0.9164 Epoch 135/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4258 - accuracy: 0.8645 - val_loss: 0.3037 - val_accuracy: 0.9124 Epoch 136/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4360 - accuracy: 0.8596 - val_loss: 0.2947 - val_accuracy: 0.9156 Epoch 137/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4269 - accuracy: 0.8649 - val_loss: 0.3061 - val_accuracy: 0.9118 Epoch 138/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4335 - accuracy: 0.8618 - val_loss: 0.2915 - val_accuracy: 0.9163 Epoch 139/200 329/329 
[==============================] - 2s 6ms/step - loss: 0.4218 - accuracy: 0.8680 - val_loss: 0.2989 - val_accuracy: 0.9138 Epoch 140/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4301 - accuracy: 0.8647 - val_loss: 0.3041 - val_accuracy: 0.9106 Epoch 141/200 329/329 [==============================] - 2s 7ms/step - loss: 0.4252 - accuracy: 0.8638 - val_loss: 0.2978 - val_accuracy: 0.9143 Epoch 142/200 329/329 [==============================] - 2s 7ms/step - loss: 0.4246 - accuracy: 0.8663 - val_loss: 0.3063 - val_accuracy: 0.9117 Epoch 143/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4234 - accuracy: 0.8650 - val_loss: 0.2884 - val_accuracy: 0.9169 Epoch 144/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4197 - accuracy: 0.8680 - val_loss: 0.2927 - val_accuracy: 0.9162 Epoch 145/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4258 - accuracy: 0.8652 - val_loss: 0.2948 - val_accuracy: 0.9152 Epoch 146/200 329/329 [==============================] - 2s 7ms/step - loss: 0.4259 - accuracy: 0.8649 - val_loss: 0.3017 - val_accuracy: 0.9118 Epoch 147/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4244 - accuracy: 0.8654 - val_loss: 0.3055 - val_accuracy: 0.9113 Epoch 148/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4246 - accuracy: 0.8659 - val_loss: 0.2955 - val_accuracy: 0.9139 Epoch 149/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4239 - accuracy: 0.8664 - val_loss: 0.2974 - val_accuracy: 0.9146 Epoch 150/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4152 - accuracy: 0.8681 - val_loss: 0.2962 - val_accuracy: 0.9153 Epoch 151/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4189 - accuracy: 0.8679 - val_loss: 0.2912 - val_accuracy: 0.9165 Epoch 152/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4248 - accuracy: 0.8645 - val_loss: 0.2928 - 
val_accuracy: 0.9154 Epoch 153/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4215 - accuracy: 0.8658 - val_loss: 0.2866 - val_accuracy: 0.9184 Epoch 154/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4175 - accuracy: 0.8682 - val_loss: 0.2832 - val_accuracy: 0.9199 Epoch 155/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4266 - accuracy: 0.8644 - val_loss: 0.2912 - val_accuracy: 0.9164 Epoch 156/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4149 - accuracy: 0.8673 - val_loss: 0.2868 - val_accuracy: 0.9186 Epoch 157/200 329/329 [==============================] - 2s 7ms/step - loss: 0.4238 - accuracy: 0.8645 - val_loss: 0.2903 - val_accuracy: 0.9164 Epoch 158/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4264 - accuracy: 0.8655 - val_loss: 0.2811 - val_accuracy: 0.9209 Epoch 159/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4164 - accuracy: 0.8666 - val_loss: 0.2901 - val_accuracy: 0.9172 Epoch 160/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4205 - accuracy: 0.8651 - val_loss: 0.2897 - val_accuracy: 0.9169 Epoch 161/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4235 - accuracy: 0.8646 - val_loss: 0.2801 - val_accuracy: 0.9209 Epoch 162/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4144 - accuracy: 0.8666 - val_loss: 0.2876 - val_accuracy: 0.9175 Epoch 163/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4152 - accuracy: 0.8673 - val_loss: 0.2906 - val_accuracy: 0.9165 Epoch 164/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4104 - accuracy: 0.8702 - val_loss: 0.2804 - val_accuracy: 0.9212 Epoch 165/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4182 - accuracy: 0.8680 - val_loss: 0.2868 - val_accuracy: 0.9175 Epoch 166/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4175 - 
accuracy: 0.8669 - val_loss: 0.2823 - val_accuracy: 0.9197 Epoch 167/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4106 - accuracy: 0.8700 - val_loss: 0.2876 - val_accuracy: 0.9180 Epoch 168/200 329/329 [==============================] - 2s 7ms/step - loss: 0.4130 - accuracy: 0.8678 - val_loss: 0.2838 - val_accuracy: 0.9194 Epoch 169/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4176 - accuracy: 0.8686 - val_loss: 0.2910 - val_accuracy: 0.9166 Epoch 170/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4091 - accuracy: 0.8700 - val_loss: 0.2827 - val_accuracy: 0.9193 Epoch 171/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4106 - accuracy: 0.8697 - val_loss: 0.2771 - val_accuracy: 0.9209 Epoch 172/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4099 - accuracy: 0.8695 - val_loss: 0.2892 - val_accuracy: 0.9169 Epoch 173/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4247 - accuracy: 0.8644 - val_loss: 0.2762 - val_accuracy: 0.9223 Epoch 174/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4080 - accuracy: 0.8698 - val_loss: 0.2829 - val_accuracy: 0.9207 Epoch 175/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4073 - accuracy: 0.8698 - val_loss: 0.2796 - val_accuracy: 0.9200 Epoch 176/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4157 - accuracy: 0.8671 - val_loss: 0.2855 - val_accuracy: 0.9181 Epoch 177/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4125 - accuracy: 0.8666 - val_loss: 0.2755 - val_accuracy: 0.9225 Epoch 178/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4092 - accuracy: 0.8704 - val_loss: 0.2897 - val_accuracy: 0.9169 Epoch 179/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4128 - accuracy: 0.8679 - val_loss: 0.2854 - val_accuracy: 0.9181 Epoch 180/200 329/329 [==============================] 
- 2s 6ms/step - loss: 0.4073 - accuracy: 0.8700 - val_loss: 0.2790 - val_accuracy: 0.9203 Epoch 181/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4092 - accuracy: 0.8702 - val_loss: 0.2797 - val_accuracy: 0.9206 Epoch 182/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4061 - accuracy: 0.8692 - val_loss: 0.2778 - val_accuracy: 0.9210 Epoch 183/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4099 - accuracy: 0.8685 - val_loss: 0.2665 - val_accuracy: 0.9246 Epoch 184/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4134 - accuracy: 0.8672 - val_loss: 0.2876 - val_accuracy: 0.9175 Epoch 185/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4137 - accuracy: 0.8683 - val_loss: 0.2722 - val_accuracy: 0.9219 Epoch 186/200 329/329 [==============================] - 2s 6ms/step - loss: 0.3975 - accuracy: 0.8715 - val_loss: 0.2795 - val_accuracy: 0.9200 Epoch 187/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4111 - accuracy: 0.8691 - val_loss: 0.2761 - val_accuracy: 0.9217 Epoch 188/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4094 - accuracy: 0.8700 - val_loss: 0.2730 - val_accuracy: 0.9231 Epoch 189/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4045 - accuracy: 0.8703 - val_loss: 0.2734 - val_accuracy: 0.9222 Epoch 190/200 329/329 [==============================] - 2s 6ms/step - loss: 0.3973 - accuracy: 0.8727 - val_loss: 0.2738 - val_accuracy: 0.9225 Epoch 191/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4066 - accuracy: 0.8700 - val_loss: 0.2811 - val_accuracy: 0.9186 Epoch 192/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4026 - accuracy: 0.8737 - val_loss: 0.2779 - val_accuracy: 0.9215 Epoch 193/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4072 - accuracy: 0.8707 - val_loss: 0.2776 - val_accuracy: 0.9214
We can use Matplotlib to plot training and validation loss, and training and validation accuracy, against the number of epochs.
plt.figure(figsize=(16,5))
# History of accuracy score
plt.subplot(1, 2, 1)
plt.plot(hist6.history['accuracy'])
plt.plot(hist6.history['val_accuracy'])
plt.title('Evolution of model accuracy score by epochs')
plt.ylabel('accuracy_score')
plt.xlabel('epoch')
plt.legend(['Train', 'Validation'], loc='lower right')
#---------------------------------
# History of Loss
plt.subplot(1, 2, 2)
plt.plot(hist6.history['loss'])
plt.plot(hist6.history['val_loss'])
plt.title('Evolution of Loss (Categorical Cross-Entropy) by epochs')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['Train', 'Validation'], loc='upper right')
plt.tight_layout()
To evaluate the performance of the model, we use the evaluate() function, which returns the loss value and metric values for the model on the supplied dataset.
# Obtain cross-entropy loss and accuracy scores on validation dataset
loss, accuracy = model6.evaluate(X_val, y_val_encoded)
print('Validation cross-entropy Loss:', loss)
print('Validation classification Accuracy:', accuracy)
1875/1875 [==============================] - 2s 1ms/step - loss: 0.2665 - accuracy: 0.9246 Validation cross-entropy Loss: 0.2665331959724426 Validation classification Accuracy: 0.9245666861534119
# Obtain cross-entropy loss and accuracy scores on test dataset
loss, accuracy = model6.evaluate(X_test, y_test_encoded)
print('Test cross-entropy Loss:', loss)
print('Test classification Accuracy:', accuracy)
563/563 [==============================] - 0s 823us/step - loss: 0.4122 - accuracy: 0.8824 Test cross-entropy Loss: 0.412158727645874 Test classification Accuracy: 0.882444441318512
• With three hidden layers, each followed by batch normalization and a dropout layer, validation accuracy and test accuracy are ~92% and 88%+ respectively - the best accuracy obtained so far
• As before, the classifier consistently scores higher on validation data than on training data, which we attribute to the dropout effect
• Notably, validation accuracy is very stable - i.e., lower variance in accuracy and hence higher confidence that it will hold
• The same is reflected in a decent test accuracy of 88%
• Let's experiment with the optimization algorithm - try Nadam
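The train-vs-validation gap attributed to dropout can be illustrated with a minimal sketch of inverted dropout (the function `dropout_forward` below is illustrative, not Keras internals): during training a random fraction of activations is zeroed and the survivors rescaled, so training-time metrics see a handicapped network, while at inference the layer is an identity.

```python
import numpy as np

def dropout_forward(x, rate, training, rng):
    """Inverted dropout: zero a fraction `rate` of units at train time
    and rescale survivors by 1/(1-rate); identity at inference."""
    if not training or rate == 0.0:
        return x
    keep = (rng.random(x.shape) >= rate).astype(x.dtype)
    return x * keep / (1.0 - rate)

rng = np.random.default_rng(7)
x = np.ones(10_000)

train_out = dropout_forward(x, rate=0.2, training=True, rng=rng)
infer_out = dropout_forward(x, rate=0.2, training=False, rng=rng)

print(round(train_out.mean(), 2))   # close to 1.0 in expectation
print((train_out == 0).mean())      # roughly 0.2 of units dropped
print(infer_out.mean())             # exactly 1.0: dropout is off at inference
```

Because roughly 20% of the units a training batch passes through are zeroed, the network the training metrics measure is weaker than the full network the validation metrics measure.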
def create_model(numLayers=3,
                 numNeurons=[256, 128, 64],
                 activation_fn=[tf.keras.layers.LeakyReLU(alpha=0.3)]*3,
                 kernel_initializer=[tf.keras.initializers.glorot_uniform(seed=7)]*3,
                 dropout_rate=[0.20, 0.10, 0.05],
                 output_activation_fn='softmax',
                 optimizer=tf.keras.optimizers.Adam(learning_rate=0.0025)):
    # Initialize sequential network
    model = tf.keras.Sequential()
    # Reshape the 32 x 32 image input into a 1-d array of 1024 features
    model.add(tf.keras.layers.Reshape(target_shape=(1024,), input_shape=(32, 32,)))
    # Add batch normalization
    model.add(tf.keras.layers.BatchNormalization())
    # Every per-layer argument must have exactly numLayers entries
    if numLayers == len(numNeurons) == len(activation_fn) == len(kernel_initializer) == len(dropout_rate):
        # Build hidden layers
        for l in range(numLayers):
            # Fully connected hidden layer
            model.add(tf.keras.layers.Dense(units=numNeurons[l],
                                            activation=activation_fn[l],
                                            kernel_initializer=kernel_initializer[l]))
            # Add batch normalization
            model.add(tf.keras.layers.BatchNormalization())
            # Add dropout for regularization, to avoid overfitting
            model.add(tf.keras.layers.Dropout(rate=dropout_rate[l]))
    else:
        # Fail loudly instead of silently building a model with no hidden layers
        raise ValueError('numLayers must match the length of each per-layer argument')
    # Output layer with 10 neurons and softmax activation function
    model.add(tf.keras.layers.Dense(units=10, activation=output_activation_fn))
    # Compile model
    model.compile(optimizer=optimizer, loss='categorical_crossentropy', metrics=['accuracy'])
    return model
# Build NN model and fit on training dataset
model7 = create_model(numLayers=3,
                      numNeurons=[256, 128, 64],
                      activation_fn=[tf.keras.layers.LeakyReLU(alpha=0.3)]*3,
                      kernel_initializer=[tf.keras.initializers.glorot_uniform(seed=7)]*3,
                      dropout_rate=[0.20, 0.10, 0.05],
                      output_activation_fn='softmax',
                      optimizer=tf.keras.optimizers.Nadam())
# Set early stopping criteria (i.e., no improvement in validation loss in 10 successive epochs)
callback = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=10,
                                            restore_best_weights=True, mode='min')
# Train model
hist7 = model7.fit(X_train, y_train_encoded, batch_size=128, epochs=200, verbose=1,
                   validation_data=(X_val, y_val_encoded), callbacks=[callback])
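The patience rule behind the EarlyStopping callback can be sketched in plain Python (a simplified stand-in, assuming we only track the list of per-epoch validation losses):

```python
def early_stop_epoch(val_losses, patience=10):
    """Return the (0-based) epoch at which training would stop, or None.

    Training stops once `patience` successive epochs pass without a new
    minimum validation loss; with restore_best_weights=True, the weights
    from the best epoch are then restored."""
    best, best_epoch = float("inf"), -1
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch = loss, epoch
        elif epoch - best_epoch >= patience:
            return epoch
    return None  # ran through all epochs without triggering

# Example: best loss at epoch 2, then 3 epochs without improvement
losses = [1.0, 0.8, 0.5, 0.6, 0.55, 0.7]
print(early_stop_epoch(losses, patience=3))   # → 5
```

With patience=10 this matches the run below: the best validation loss lands at epoch 127, so training halts at epoch 137 and the epoch-127 weights are kept.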
Epoch 1/200 329/329 [==============================] - 4s 8ms/step - loss: 1.5367 - accuracy: 0.4981 - val_loss: 1.3148 - val_accuracy: 0.5923
[epochs 2-126 omitted; loss falls steadily from ≈ 1.09 to ≈ 0.43, val_accuracy climbs from ≈ 0.63 to ≈ 0.91]
Epoch 127/200 329/329 [==============================] - 3s 8ms/step - loss: 0.4364 - accuracy: 0.8619 - val_loss: 0.2869 - val_accuracy: 0.9189
[epochs 128-136 omitted; no further improvement in val_loss]
Epoch 137/200 329/329 [==============================] - 3s 8ms/step - loss: 0.4231 - accuracy: 0.8638 - val_loss: 0.2912 - val_accuracy: 0.9161
We can plot training loss and validation loss, as well as training accuracy and validation accuracy, against the number of epochs using Matplotlib.
plt.figure(figsize=(16,5))
# History of accuracy score
plt.subplot(1, 2, 1)
plt.plot(hist7.history['accuracy'])
plt.plot(hist7.history['val_accuracy'])
plt.title('Evolution of model accuracy score by epochs')
plt.ylabel('accuracy_score')
plt.xlabel('epoch')
plt.legend(['Train', 'Validation'], loc='lower right')
#---------------------------------
# History of Loss
plt.subplot(1, 2, 2)
plt.plot(hist7.history['loss'])
plt.plot(hist7.history['val_loss'])
plt.title('Evolution of Loss (Categorical Cross-Entropy) by epochs')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['Train', 'Validation'], loc='upper right')
plt.tight_layout()
To evaluate the performance of the model, we use the evaluate() function, which returns the loss value and metric values for the model on the supplied dataset.
# Obtain cross-entropy loss and accuracy scores on validation dataset
loss, accuracy = model7.evaluate(X_val, y_val_encoded)
print('Validation cross-entropy Loss:', loss)
print('Validation classification Accuracy:', accuracy)
1875/1875 [==============================] - 2s 1ms/step - loss: 0.2869 - accuracy: 0.9189 Validation cross-entropy Loss: 0.286933958530426 Validation classification Accuracy: 0.9189000129699707
# Obtain cross-entropy loss and accuracy scores on test dataset
loss, accuracy = model7.evaluate(X_test, y_test_encoded)
print('Test cross-entropy Loss:', loss)
print('Test classification Accuracy:', accuracy)
563/563 [==============================] - 0s 827us/step - loss: 0.4241 - accuracy: 0.8799 Test cross-entropy Loss: 0.4240877330303192 Test classification Accuracy: 0.8799444437026978
• Validation and test accuracy scores are ~92% and ~88% respectively
• The Adam optimizer still does a touch better than Nadam
• Since the images are noisy, we may also want to consider SGD with Nesterov momentum
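Before handing this to Keras, the update rule can be sketched with NumPy (a minimal loop on a toy quadratic; the function name and setup are illustrative, not Keras internals). With Nesterov momentum the gradient is evaluated at the look-ahead point w + momentum·v, which reacts to a change in slope one step earlier than classical momentum and tends to damp oscillation on noisy objectives:

```python
import numpy as np

def sgd_nesterov(grad_fn, w, lr=0.005, momentum=0.9, steps=200):
    """Minimise a function via gradient descent with Nesterov momentum:
        v <- momentum * v - lr * grad(w + momentum * v)   # look-ahead gradient
        w <- w + v
    """
    v = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w + momentum * v)   # gradient at the look-ahead point
        v = momentum * v - lr * g
        w = w + v
    return w

# Toy quadratic f(w) = ||w||^2 / 2, whose gradient is w itself
w_final = sgd_nesterov(lambda w: w, np.array([5.0, -3.0]))
print(np.round(w_final, 4))   # converges towards the minimum at the origin
```

The same hyperparameters (learning_rate=0.005, momentum=0.9, nesterov=True) are passed to tf.keras.optimizers.SGD below.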
# Build NN model and fit on training dataset
model8 = create_model(numLayers=3,
                      numNeurons=[256, 128, 64],
                      activation_fn=[tf.keras.layers.LeakyReLU(alpha=0.3)]*3,
                      kernel_initializer=[tf.keras.initializers.glorot_uniform(seed=7)]*3,
                      dropout_rate=[0.20, 0.10, 0.05],
                      output_activation_fn='softmax',
                      optimizer=tf.keras.optimizers.SGD(learning_rate=0.005,
                                                        momentum=0.9, nesterov=True))
# Set early stopping criteria (i.e., no improvement in validation loss in 10 successive epochs)
callback = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=10,
                                            restore_best_weights=True, mode='min')
# Train model
hist8 = model8.fit(X_train, y_train_encoded, batch_size=128, epochs=200, verbose=1,
                   validation_data=(X_val, y_val_encoded), callbacks=[callback])
Epoch 1/200 329/329 [==============================] - 3s 6ms/step - loss: 1.7938 - accuracy: 0.3934 - val_loss: 1.2972 - val_accuracy: 0.6460
[epochs 2-68 omitted; loss falls steadily from ≈ 1.22 to ≈ 0.51, val_accuracy climbs from ≈ 0.70 to ≈ 0.89]
Epoch 69/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5047 - accuracy: 0.8406 - val_loss: 0.3663 - val_accuracy: 0.8932 Epoch 70/200 329/329 [==============================] - 2s 6ms/step
- loss: 0.5100 - accuracy: 0.8406 - val_loss: 0.3658 - val_accuracy: 0.8932 Epoch 71/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5002 - accuracy: 0.8425 - val_loss: 0.3460 - val_accuracy: 0.9010 Epoch 72/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4980 - accuracy: 0.8430 - val_loss: 0.3691 - val_accuracy: 0.8910 Epoch 73/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4997 - accuracy: 0.8417 - val_loss: 0.3711 - val_accuracy: 0.8921 Epoch 74/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5055 - accuracy: 0.8405 - val_loss: 0.3635 - val_accuracy: 0.8940 Epoch 75/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4952 - accuracy: 0.8458 - val_loss: 0.3439 - val_accuracy: 0.9009 Epoch 76/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4937 - accuracy: 0.8458 - val_loss: 0.3423 - val_accuracy: 0.9011 Epoch 77/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4900 - accuracy: 0.8452 - val_loss: 0.3858 - val_accuracy: 0.8857 Epoch 78/200 329/329 [==============================] - 2s 6ms/step - loss: 0.5008 - accuracy: 0.8434 - val_loss: 0.3493 - val_accuracy: 0.8994 Epoch 79/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4927 - accuracy: 0.8452 - val_loss: 0.3506 - val_accuracy: 0.8984 Epoch 80/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4934 - accuracy: 0.8448 - val_loss: 0.3495 - val_accuracy: 0.8983 Epoch 81/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4829 - accuracy: 0.8470 - val_loss: 0.3460 - val_accuracy: 0.8992 Epoch 82/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4881 - accuracy: 0.8468 - val_loss: 0.3391 - val_accuracy: 0.9021 Epoch 83/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4816 - accuracy: 0.8474 - val_loss: 0.3464 - val_accuracy: 0.9009 Epoch 84/200 329/329 
[==============================] - 2s 6ms/step - loss: 0.4886 - accuracy: 0.8481 - val_loss: 0.3483 - val_accuracy: 0.9004 Epoch 85/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4811 - accuracy: 0.8478 - val_loss: 0.3385 - val_accuracy: 0.9026 Epoch 86/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4820 - accuracy: 0.8480 - val_loss: 0.3353 - val_accuracy: 0.9031 Epoch 87/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4786 - accuracy: 0.8460 - val_loss: 0.3455 - val_accuracy: 0.9000 Epoch 88/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4816 - accuracy: 0.8475 - val_loss: 0.3303 - val_accuracy: 0.9046 Epoch 89/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4784 - accuracy: 0.8499 - val_loss: 0.3306 - val_accuracy: 0.9061 Epoch 90/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4758 - accuracy: 0.8507 - val_loss: 0.3379 - val_accuracy: 0.9025 Epoch 91/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4746 - accuracy: 0.8511 - val_loss: 0.3326 - val_accuracy: 0.9045 Epoch 92/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4756 - accuracy: 0.8496 - val_loss: 0.3424 - val_accuracy: 0.9018 Epoch 93/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4807 - accuracy: 0.8485 - val_loss: 0.3293 - val_accuracy: 0.9049 Epoch 94/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4751 - accuracy: 0.8507 - val_loss: 0.3241 - val_accuracy: 0.9068 Epoch 95/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4720 - accuracy: 0.8502 - val_loss: 0.3397 - val_accuracy: 0.9016 Epoch 96/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4695 - accuracy: 0.8525 - val_loss: 0.3310 - val_accuracy: 0.9051 Epoch 97/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4649 - accuracy: 0.8536 - val_loss: 0.3392 - val_accuracy: 0.9027 
Epoch 98/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4672 - accuracy: 0.8532 - val_loss: 0.3317 - val_accuracy: 0.9038 Epoch 99/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4744 - accuracy: 0.8504 - val_loss: 0.3296 - val_accuracy: 0.9047 Epoch 100/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4700 - accuracy: 0.8513 - val_loss: 0.3198 - val_accuracy: 0.9096 Epoch 101/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4634 - accuracy: 0.8520 - val_loss: 0.3283 - val_accuracy: 0.9055 Epoch 102/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4635 - accuracy: 0.8520 - val_loss: 0.3196 - val_accuracy: 0.9086 Epoch 103/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4651 - accuracy: 0.8547 - val_loss: 0.3205 - val_accuracy: 0.9090 Epoch 104/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4636 - accuracy: 0.8533 - val_loss: 0.3267 - val_accuracy: 0.9058 Epoch 105/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4639 - accuracy: 0.8528 - val_loss: 0.3151 - val_accuracy: 0.9105 Epoch 106/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4593 - accuracy: 0.8539 - val_loss: 0.3257 - val_accuracy: 0.9061 Epoch 107/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4683 - accuracy: 0.8518 - val_loss: 0.3204 - val_accuracy: 0.9088 Epoch 108/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4577 - accuracy: 0.8548 - val_loss: 0.3378 - val_accuracy: 0.9016 Epoch 109/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4680 - accuracy: 0.8519 - val_loss: 0.3322 - val_accuracy: 0.9041 Epoch 110/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4620 - accuracy: 0.8548 - val_loss: 0.3172 - val_accuracy: 0.9094 Epoch 111/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4538 - accuracy: 0.8554 - val_loss: 
0.3304 - val_accuracy: 0.9034 Epoch 112/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4596 - accuracy: 0.8525 - val_loss: 0.3123 - val_accuracy: 0.9115 Epoch 113/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4572 - accuracy: 0.8550 - val_loss: 0.3170 - val_accuracy: 0.9084 Epoch 114/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4551 - accuracy: 0.8562 - val_loss: 0.3109 - val_accuracy: 0.9104 Epoch 115/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4493 - accuracy: 0.8560 - val_loss: 0.3323 - val_accuracy: 0.9036 Epoch 116/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4532 - accuracy: 0.8561 - val_loss: 0.3190 - val_accuracy: 0.9083 Epoch 117/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4533 - accuracy: 0.8553 - val_loss: 0.3280 - val_accuracy: 0.9050 Epoch 118/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4551 - accuracy: 0.8554 - val_loss: 0.3104 - val_accuracy: 0.9124 Epoch 119/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4471 - accuracy: 0.8593 - val_loss: 0.3158 - val_accuracy: 0.9093 Epoch 120/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4544 - accuracy: 0.8552 - val_loss: 0.3198 - val_accuracy: 0.9074 Epoch 121/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4530 - accuracy: 0.8569 - val_loss: 0.3156 - val_accuracy: 0.9088 Epoch 122/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4402 - accuracy: 0.8603 - val_loss: 0.3114 - val_accuracy: 0.9106 Epoch 123/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4496 - accuracy: 0.8581 - val_loss: 0.3055 - val_accuracy: 0.9135 Epoch 124/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4469 - accuracy: 0.8582 - val_loss: 0.3017 - val_accuracy: 0.9151 Epoch 125/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4474 
- accuracy: 0.8595 - val_loss: 0.3009 - val_accuracy: 0.9151 Epoch 126/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4470 - accuracy: 0.8596 - val_loss: 0.3055 - val_accuracy: 0.9139 Epoch 127/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4500 - accuracy: 0.8575 - val_loss: 0.3123 - val_accuracy: 0.9118 Epoch 128/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4471 - accuracy: 0.8565 - val_loss: 0.3118 - val_accuracy: 0.9105 Epoch 129/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4472 - accuracy: 0.8566 - val_loss: 0.3021 - val_accuracy: 0.9145 Epoch 130/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4445 - accuracy: 0.8592 - val_loss: 0.3029 - val_accuracy: 0.9148 Epoch 131/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4436 - accuracy: 0.8595 - val_loss: 0.2974 - val_accuracy: 0.9156 Epoch 132/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4382 - accuracy: 0.8619 - val_loss: 0.2960 - val_accuracy: 0.9167 Epoch 133/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4355 - accuracy: 0.8601 - val_loss: 0.3054 - val_accuracy: 0.9124 Epoch 134/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4484 - accuracy: 0.8568 - val_loss: 0.3290 - val_accuracy: 0.9042 Epoch 135/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4451 - accuracy: 0.8590 - val_loss: 0.3183 - val_accuracy: 0.9081 Epoch 136/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4405 - accuracy: 0.8604 - val_loss: 0.3064 - val_accuracy: 0.9126 Epoch 137/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4453 - accuracy: 0.8581 - val_loss: 0.3032 - val_accuracy: 0.9144 Epoch 138/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4366 - accuracy: 0.8624 - val_loss: 0.2978 - val_accuracy: 0.9160 Epoch 139/200 329/329 
[==============================] - 2s 6ms/step - loss: 0.4432 - accuracy: 0.8576 - val_loss: 0.2970 - val_accuracy: 0.9158 Epoch 140/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4437 - accuracy: 0.8578 - val_loss: 0.2946 - val_accuracy: 0.9169 Epoch 141/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4338 - accuracy: 0.8611 - val_loss: 0.3180 - val_accuracy: 0.9087 Epoch 142/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4411 - accuracy: 0.8604 - val_loss: 0.3096 - val_accuracy: 0.9101 Epoch 143/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4304 - accuracy: 0.8630 - val_loss: 0.2999 - val_accuracy: 0.9151 Epoch 144/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4308 - accuracy: 0.8629 - val_loss: 0.3075 - val_accuracy: 0.9129 Epoch 145/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4296 - accuracy: 0.8636 - val_loss: 0.2991 - val_accuracy: 0.9154 Epoch 146/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4263 - accuracy: 0.8656 - val_loss: 0.2930 - val_accuracy: 0.9178 Epoch 147/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4267 - accuracy: 0.8650 - val_loss: 0.2982 - val_accuracy: 0.9157 Epoch 148/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4315 - accuracy: 0.8622 - val_loss: 0.2892 - val_accuracy: 0.9185 Epoch 149/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4230 - accuracy: 0.8659 - val_loss: 0.2942 - val_accuracy: 0.9174 Epoch 150/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4306 - accuracy: 0.8637 - val_loss: 0.2967 - val_accuracy: 0.9152 Epoch 151/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4296 - accuracy: 0.8651 - val_loss: 0.2932 - val_accuracy: 0.9171 Epoch 152/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4249 - accuracy: 0.8657 - val_loss: 0.2926 - 
val_accuracy: 0.9172 Epoch 153/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4265 - accuracy: 0.8639 - val_loss: 0.2859 - val_accuracy: 0.9213 Epoch 154/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4261 - accuracy: 0.8662 - val_loss: 0.2933 - val_accuracy: 0.9168 Epoch 155/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4237 - accuracy: 0.8664 - val_loss: 0.3188 - val_accuracy: 0.9081 Epoch 156/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4361 - accuracy: 0.8613 - val_loss: 0.2922 - val_accuracy: 0.9186 Epoch 157/200 329/329 [==============================] - 2s 7ms/step - loss: 0.4254 - accuracy: 0.8658 - val_loss: 0.2882 - val_accuracy: 0.9200 Epoch 158/200 329/329 [==============================] - 2s 7ms/step - loss: 0.4271 - accuracy: 0.8629 - val_loss: 0.3043 - val_accuracy: 0.9145 Epoch 159/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4349 - accuracy: 0.8595 - val_loss: 0.2948 - val_accuracy: 0.9178 Epoch 160/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4287 - accuracy: 0.8646 - val_loss: 0.3049 - val_accuracy: 0.9124 Epoch 161/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4306 - accuracy: 0.8606 - val_loss: 0.2884 - val_accuracy: 0.9190 Epoch 162/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4266 - accuracy: 0.8637 - val_loss: 0.2866 - val_accuracy: 0.9207 Epoch 163/200 329/329 [==============================] - 2s 6ms/step - loss: 0.4206 - accuracy: 0.8662 - val_loss: 0.2904 - val_accuracy: 0.9175
Using Matplotlib, we can plot the training and validation loss against the number of epochs, and likewise the training and validation accuracy.
plt.figure(figsize=(16,5))
# History of accuracy score
plt.subplot(1, 2, 1)
plt.plot(hist8.history['accuracy'])
plt.plot(hist8.history['val_accuracy'])
plt.title('Evolution of model accuracy score by epochs')
plt.ylabel('accuracy_score')
plt.xlabel('epoch')
plt.legend(['Train', 'Validation'], loc='lower right')
#---------------------------------
# History of Loss
plt.subplot(1, 2, 2)
plt.plot(hist8.history['loss'])
plt.plot(hist8.history['val_loss'])
plt.title('Evolution of Loss (Categorical Cross-Entropy) by epochs')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['Train', 'Validation'], loc='upper right')
plt.tight_layout()
# Obtain cross-entropy loss and accuracy scores on validation dataset
loss, accuracy = model8.evaluate(X_val, y_val_encoded)
print('Validation cross-entropy Loss:', loss)
print('Validation classification Accuracy:', accuracy)
1875/1875 [==============================] - 2s 1ms/step - loss: 0.2859 - accuracy: 0.9213 Validation cross-entropy Loss: 0.2858662009239197 Validation classification Accuracy: 0.9212666749954224
To evaluate the model's performance on the test data, we use the evaluate() function, which returns the loss value and the metric values for the model.
# Obtain cross-entropy loss and accuracy scores on test dataset
loss, accuracy = model8.evaluate(X_test, y_test_encoded)
print('Test cross-entropy Loss:', loss)
print('Test classification Accuracy:', accuracy)
563/563 [==============================] - 0s 825us/step - loss: 0.4328 - accuracy: 0.8771 Test cross-entropy Loss: 0.4328251779079437 Test classification Accuracy: 0.8770555257797241
Again we obtain fairly good accuracy on both the validation and test datasets for SGD with Nesterov momentum:
• Validation accuracy of ~92% and test accuracy of ~87.7%
• However, Adam still beats the SGD optimizer in terms of both minimum loss and maximum accuracy
• I tried RMSProp too and still found Adam to be the best
• I finalize the best model, with a validation accuracy of 91.85% (loss: 28.61%) and a test accuracy of 88.08% (loss: 41.99%)
• Model specification: a neural net with three hidden layers; batch normalization and dropout layers applied after each hidden layer; Leaky ReLU (alpha = 0.3) as the hidden-layer activation function; the Adam optimizer with learning rate 0.0025; a softmax activation function in the output layer; and categorical cross-entropy as the loss function
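The mathematical pieces named in this specification (Leaky ReLU, softmax, categorical cross-entropy) can be illustrated numerically. A minimal NumPy sketch, independent of Keras, with toy logits and a one-hot label chosen for illustration:

```python
import numpy as np

def leaky_relu(x, alpha=0.3):
    # Leaky ReLU as used in the hidden layers (alpha = 0.3)
    return np.where(x > 0, x, alpha * x)

def softmax(z):
    # Softmax for the output layer; shift by the max for numerical stability
    e = np.exp(z - np.max(z))
    return e / e.sum()

def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    # Loss used to train the model: -sum(y_true * log(y_pred))
    return -np.sum(y_true * np.log(np.clip(y_pred, eps, 1.0)))

# Toy 3-class example (hypothetical values, not from the model)
logits = np.array([2.0, 1.0, -1.0])
probs = softmax(logits)
y_true = np.array([1.0, 0.0, 0.0])  # one-hot label, as produced by to_categorical
loss = categorical_cross_entropy(y_true, probs)
print(leaky_relu(np.array([-1.0, 2.0])))  # negative inputs are scaled by alpha
print(probs, loss)
```

The clipping inside the loss mirrors what Keras does internally to avoid log(0) when a predicted probability underflows.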
Model evaluation
# Store the best model
best_model = model6
# Imports needed for the evaluation metrics used below
from sklearn.metrics import confusion_matrix, classification_report
# Prepare target data for model evaluation
y_test_actual = np.argmax(y_test_encoded, axis=1)
y_test_pred = np.argmax(best_model.predict(X_test), axis=1)
# Set the figure size
plt.figure(figsize=(16, 10))
# Calculate the confusion matrix
cm = confusion_matrix(y_true=y_test_actual, y_pred=y_test_pred)
# Normalize the confusion matrix
cm = cm.astype('float') / cm.sum(axis=1)[:, np.newaxis] * 100.0
# Visualize the confusion matrix
sns.heatmap(cm, annot=True, cmap='Greens', fmt='.1f', square=True)\
.set_title('Confusion matrix - Test Data (SVHN digit classification)')
_ = plt.xlabel('Predicted')
_ = plt.ylabel('Actual')
Comments on the confusion matrix above:
• Digits 0, 4 and 7 are classified most accurately (92.1%, 90.8% and 90.3% respectively)
• Digits 8, 6, 3 and 9 are classified less accurately, likely because these digits share very similar turns and edges
• Per-digit prediction accuracy scores for SVHN using the DNN range from 84.3% to 92.1%
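The per-digit accuracies quoted above are simply the diagonal of the row-normalized confusion matrix. A small sketch using a hypothetical 3-class matrix (not the SVHN one):

```python
import numpy as np

# Hypothetical raw confusion matrix (rows = actual, columns = predicted)
cm = np.array([[90,  5,  5],
               [10, 80, 10],
               [ 5, 15, 80]], dtype=float)

# Row-normalize to percentages, as done for the heatmap above
cm_pct = cm / cm.sum(axis=1)[:, np.newaxis] * 100.0

# Per-class accuracy (recall) is the diagonal of the normalized matrix
per_class_acc = np.diag(cm_pct)
print(per_class_acc)  # -> [90. 80. 80.]
```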
# Print the classification report
print(classification_report(y_test_actual, y_test_pred))
precision recall f1-score support
0 0.91 0.90 0.91 1814
1 0.85 0.90 0.87 1828
2 0.92 0.89 0.90 1803
3 0.84 0.85 0.85 1719
4 0.90 0.90 0.90 1812
5 0.88 0.88 0.88 1768
6 0.88 0.86 0.87 1832
7 0.90 0.91 0.90 1808
8 0.88 0.84 0.86 1812
9 0.86 0.88 0.87 1804
accuracy 0.88 18000
macro avg 0.88 0.88 0.88 18000
weighted avg 0.88 0.88 0.88 18000
Looking at the f1-scores, digits 7, 0 and 2 are classified most accurately.
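As a quick sanity check on the report, the f1-score is the harmonic mean of precision and recall; for digit 7 (precision 0.90, recall 0.91 in the table above):

```python
def f1_from_pr(precision, recall):
    # Harmonic mean of precision and recall
    return 2 * precision * recall / (precision + recall)

# Digit 7 from the classification report above
print(round(f1_from_pr(0.90, 0.91), 2))  # -> 0.9
```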
Conclusion:
• An ANN / DNN predicts the image class based on the distribution of image pixel intensities
• The dataset in hand is SVHN imagery; the images therefore contain a lot of noise and are hard to discern from pixel intensities alone
• The best ANN model achieved a validation accuracy score of ~92% and a test accuracy score of ~88%
• Whichever DNN you apply, if the source images contain a lot of blurriness / noise, very high accuracy is unlikely without a technique that can detect the edges within the images
• I think a CNN is a better candidate technology for the SVHN image classification problem, given its inherent complexity
# Prepare model and fitting parameters ... [for future tests in Grid Search CV for Neural Nets]
#mdl = tf.keras.wrappers.scikit_learn.KerasClassifier(build_fn=create_model
# , verbose=0
# , validation_data=(X_val, y_val_encoded)
# , callbacks=[callback]
# )
# **********************************************************
#batch_size=[64, 128]
#epochs=[100,150,200]
#activation_fn=[[tf.keras.layers.LeakyReLU(alpha=0.3)]*3, ['relu']*3, ['tanh']*3, ['sigmoid']*3]
#optimizer = [tf.keras.optimizers.Adam(learning_rate=0.0025),
# tf.keras.optimizers.Nadam(learning_rate=0.0025),
# tf.keras.optimizers.RMSprop(learning_rate=0.0025),
# tf.keras.optimizers.SGD(learning_rate=0.0025, nesterov=True)] #'Adagrad', 'Adadelta', 'Adamax'
#param_grid = dict(batch_size=batch_size, epochs=epochs, activation_fn=activation_fn, optimizer=optimizer)
#from sklearn.model_selection import GridSearchCV
#grid = GridSearchCV(estimator=mdl, param_grid=param_grid, n_jobs=-1, cv=3)
#grid_result = grid.fit(X_train, y_train_encoded)
# ***********************************************************
# summarize results
#print("Best: %f using %s" % (grid_result.best_score_, grid_result.best_params_))
#means = grid_result.cv_results_['mean_test_score']
#stds = grid_result.cv_results_['std_test_score']
#params = grid_result.cv_results_['params']
#for mean, stdev, param in zip(means, stds, params):
# print("%f (%f) with: %r" % (mean, stdev, param))
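The commented-out code above relies on a Keras-to-scikit-learn wrapper; the GridSearchCV mechanics themselves can be sketched with a plain scikit-learn estimator on hypothetical toy data (standing in for X_train / y_train_encoded):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

# Toy data standing in for X_train / y_train_encoded
X, y = make_classification(n_samples=200, n_features=10, random_state=7)

# Same pattern as the commented code: a dict of parameter lists, 3-fold CV
param_grid = {"C": [0.1, 1.0, 10.0]}
grid = GridSearchCV(LogisticRegression(max_iter=1000), param_grid, cv=3, n_jobs=-1)
grid_result = grid.fit(X, y)

# Summarize results, mirroring the commented code above
print("Best: %f using %s" % (grid_result.best_score_, grid_result.best_params_))
for mean, stdev, param in zip(grid_result.cv_results_['mean_test_score'],
                              grid_result.cv_results_['std_test_score'],
                              grid_result.cv_results_['params']):
    print("%f (%f) with: %r" % (mean, stdev, param))
```

For a Keras model the same pattern applies, with the model-building function wrapped so that GridSearchCV can clone and refit it per fold.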